Sep 30 02:54:31 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 02:54:31 crc restorecon[4673]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 02:54:31 crc restorecon[4673]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 02:54:31 crc restorecon[4673]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc 
restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc 
restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:31 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 
crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 02:54:32 crc restorecon[4673]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc 
restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc 
restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc 
restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 02:54:32 crc restorecon[4673]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 02:54:33 crc kubenswrapper[4744]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 02:54:33 crc kubenswrapper[4744]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 02:54:33 crc kubenswrapper[4744]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 02:54:33 crc kubenswrapper[4744]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 02:54:33 crc kubenswrapper[4744]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 02:54:33 crc kubenswrapper[4744]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.224933 4744 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235630 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235666 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235676 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235688 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235699 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235709 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235717 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235726 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235740 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235748 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235756 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235763 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235771 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235779 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235787 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235795 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235802 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235810 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235819 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235826 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235844 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235852 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235861 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235872 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235881 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235890 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235898 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235906 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235915 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235923 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235932 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235939 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235947 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235956 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235963 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235971 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235978 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235986 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.235993 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236001 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236009 4744 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236017 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236025 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236036 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236047 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236057 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236067 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236076 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236085 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236095 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236102 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236111 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236121 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236132 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236141 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236150 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236161 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236171 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236180 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236188 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236196 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236203 4744 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236211 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236281 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236290 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236299 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236307 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236314 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236324 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236333 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.236341 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237313 4744 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237339 4744 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237354 4744 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237395 4744 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237408 4744 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237418 4744 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237431 4744 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237443 4744 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237453 4744 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237463 4744 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237474 4744 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237484 4744 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237493 4744 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237502 4744 flags.go:64] FLAG: --cgroup-root=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237511 4744 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237521 4744 flags.go:64] FLAG: --client-ca-file=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237530 4744 flags.go:64] FLAG: --cloud-config=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237538 4744 flags.go:64] FLAG: --cloud-provider=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237548 4744 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237559 4744 flags.go:64] FLAG: --cluster-domain=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237568 4744 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237578 4744 flags.go:64] FLAG: --config-dir=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237587 4744 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237596 4744 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237609 4744 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237620 4744 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237631 4744 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237641 4744 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237651 4744 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237660 4744 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237671 4744 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237681 4744 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237690 4744 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237701 4744 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237710 4744 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237719 4744 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237728 4744 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237740 4744 flags.go:64] FLAG: --enable-server="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237749 4744 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237761 4744 flags.go:64] FLAG: --event-burst="100"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237771 4744 flags.go:64] FLAG: --event-qps="50"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237781 4744 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237790 4744 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237800 4744 flags.go:64] FLAG: --eviction-hard=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237814 4744 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237823 4744 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237832 4744 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237842 4744 flags.go:64] FLAG: --eviction-soft=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237851 4744 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237861 4744 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237870 4744 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237879 4744 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237888 4744 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237897 4744 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237907 4744 flags.go:64] FLAG: --feature-gates=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237918 4744 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237928 4744 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237938 4744 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237947 4744 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237957 4744 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237967 4744 flags.go:64] FLAG: --help="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237976 4744 flags.go:64] FLAG: --hostname-override=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237985 4744 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.237995 4744 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238004 4744 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238013 4744 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238021 4744 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238030 4744 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238039 4744 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238049 4744 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238058 4744 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238068 4744 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238078 4744 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238089 4744 flags.go:64] FLAG: --kube-reserved=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238100 4744 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238110 4744 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238119 4744 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238128 4744 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238138 4744 flags.go:64] FLAG: --lock-file=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238147 4744 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238156 4744 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238165 4744 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238179 4744 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238189 4744 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238198 4744 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238208 4744 flags.go:64] FLAG: --logging-format="text"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238217 4744 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238227 4744 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238236 4744 flags.go:64] FLAG: --manifest-url=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238245 4744 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238257 4744 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238267 4744 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238278 4744 flags.go:64] FLAG: --max-pods="110"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238287 4744 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238297 4744 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238306 4744 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238315 4744 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238325 4744 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238334 4744 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238343 4744 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238364 4744 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238396 4744 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238406 4744 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238415 4744 flags.go:64] FLAG: --pod-cidr=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238424 4744 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238439 4744 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238448 4744 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238458 4744 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238467 4744 flags.go:64] FLAG: --port="10250"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238477 4744 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238486 4744 flags.go:64] FLAG: --provider-id=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238495 4744 flags.go:64] FLAG: --qos-reserved=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238504 4744 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238513 4744 flags.go:64] FLAG: --register-node="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238523 4744 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238531 4744 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238547 4744 flags.go:64] FLAG: --registry-burst="10"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238556 4744 flags.go:64] FLAG: --registry-qps="5"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238566 4744 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238575 4744 flags.go:64] FLAG: --reserved-memory=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238587 4744 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238597 4744 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238606 4744 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238615 4744 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238625 4744 flags.go:64] FLAG: --runonce="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238633 4744 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238643 4744 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238654 4744 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238664 4744 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238674 4744 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238683 4744 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238693 4744 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238703 4744 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238712 4744 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238720 4744 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238730 4744 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238739 4744 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238748 4744 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238758 4744 flags.go:64] FLAG: --system-cgroups=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238767 4744 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238781 4744 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238791 4744 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238800 4744 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238811 4744 flags.go:64] FLAG: --tls-min-version=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238820 4744 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238831 4744 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238840 4744 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238849 4744 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238858 4744 flags.go:64] FLAG: --v="2"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238870 4744 flags.go:64] FLAG: --version="false"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238883 4744 flags.go:64] FLAG: --vmodule=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238903 4744 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238913 4744 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239111 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239122 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239130 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239139 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239147 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239155 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239163 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239171 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239179 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239188 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239196 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239204 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239212 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239220 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239228 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239237 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239244 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239252 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239261 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239269 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239278 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239289 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239297 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239305 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239313 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239321 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239329 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239336 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239345 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239354 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239362 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239391 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239400 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239408 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239416 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239424 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239432 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239439 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239448 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239457 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239464 4744 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239472 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239480 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239488 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239496 4744 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239503 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239511 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239522 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
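At `-v=2` the kubelet dumps every effective command-line flag at startup (the `flags.go:64] FLAG:` lines above), which makes the journal a convenient record of the running configuration. A minimal sketch for pulling a single flag value back out of a saved excerpt; the `/tmp/kubelet-flags.log` path and its two sample lines are hypothetical stand-ins for `journalctl -u kubelet` output:

```shell
# Recover the value of one kubelet flag from a saved journal excerpt.
# Sample file is a stand-in; on a live node, pipe `journalctl -u kubelet`
# into the same sed invocation instead.
cat > /tmp/kubelet-flags.log <<'EOF'
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238334 4744 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.238467 4744 flags.go:64] FLAG: --port="10250"
EOF
# Print only the quoted value of --node-ip.
sed -n 's/.*FLAG: --node-ip="\([^"]*\)".*/\1/p' /tmp/kubelet-flags.log
```

Swapping the flag name in the sed pattern extracts any other `FLAG:` entry the same way.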
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239532 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239540 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239552 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239561 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239569 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239577 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239587 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239597 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239607 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239617 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239626 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239635 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239643 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239651 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239659 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239667 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239676 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239684 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239693 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239701 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239708 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239718 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.239727 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.240516 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.252720 4744 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.252784 4744 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252924 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252945 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252954 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252963 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252972 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252980 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.252991 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253003 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253013 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253023 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253032 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253040 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253048 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253060 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253071 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253081 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253090 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253099 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253107 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253116 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253124 4744 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253134 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253142 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253150 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253159 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253168 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253177 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253187 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253195 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253204 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253212 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253220 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253228 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253236 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253244 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253252 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253260 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253268 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253276 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253284 4744 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253291 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253299 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253307 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253315 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253323 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253331 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253338 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253346 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253355 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253362 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253392 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253400 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253409 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253417 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253425 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253433 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253440 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253449 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253458 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253467 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253475 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253483 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253491 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253499 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253507 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253515 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253523 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253533 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253543 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253552 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253563 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.253578 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253808 4744 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253822 4744 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253830 4744 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253838 4744 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253846 4744 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253854 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253862 4744 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253870 4744 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253878 4744 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253886 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253893 4744 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253901 4744 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253909 4744 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253917 4744 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253925 4744 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253932 4744 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253940 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253948 4744 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253958 4744 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253968 4744 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253979 4744 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253989 4744 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.253998 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254007 4744 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254015 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254023 4744 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254031 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254040 4744 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254050 4744 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254060 4744 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254069 4744 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254077 4744 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254086 4744 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254094 4744 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254103 4744 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254111 4744 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254121 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254129 4744 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254138 4744 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254145 4744 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254153 4744 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254161 4744 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254168 4744 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254176 4744 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254184 4744 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254192 4744 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254200 4744 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254208 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254216 4744 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254224 4744 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254232 4744 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254240 4744 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254248 4744 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254256 4744 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254264 4744 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254271 4744 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254279 4744 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254287 4744 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254295 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254305 4744 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254314 4744 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254322 4744 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254334 4744 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254343 4744 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254352 4744 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254360 4744 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254388 4744 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254396 4744 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254404 4744 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254413 4744 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.254420 4744 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.254433 4744 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.255705 4744 server.go:940] "Client rotation is on, will bootstrap in background"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.261501 4744 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.261642 4744 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.264196 4744 server.go:997] "Starting client certificate rotation"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.264252 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.265054 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-17 08:01:21.446410035 +0000 UTC
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.265161 4744 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1157h6m48.181254717s for next certificate rotation
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.295663 4744 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.297522 4744 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.327478 4744 log.go:25] "Validated CRI v1 runtime API"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.366506 4744 log.go:25] "Validated CRI v1 image API"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.368987 4744 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.377740 4744 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-02-49-42-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.377780 4744 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.395242 4744 manager.go:217] Machine: {Timestamp:2025-09-30 02:54:33.390078314 +0000 UTC m=+0.563298318 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799886 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:02f345d7-bf31-49d0-b2d4-5371ee59f26c BootID:ace33109-5427-4ec8-95ec-e0c80b341759 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ce:54:18 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ce:54:18 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:aa:e7:4c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:52:b0:28 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:46:87:99 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:44:cf:2f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9e:d7:69:31:48:b1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:b6:8a:ef:58:95 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.395819 4744 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.396027 4744 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.398221 4744 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.398585 4744 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.398640 4744 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.398970 4744 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.398991 4744 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.400032 4744 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.400091 4744 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.400513 4744 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.400667 4744 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.406179 4744 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.406231 4744 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.406288 4744 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.406311 4744 kubelet.go:324] "Adding apiserver pod source"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.406329 4744 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.411852 4744 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.413854 4744 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.414047 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.414124 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.414191 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.414219 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.416233 4744 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.417975 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418073 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418089 
4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418103 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418123 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418138 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418151 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418173 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418190 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418248 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418319 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418335 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.418397 4744 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.419199 4744 server.go:1280] "Started kubelet" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.420306 4744 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.420327 4744 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.420453 4744 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.420852 4744 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 30 02:54:33 crc systemd[1]: Started Kubernetes Kubelet. Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.422704 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.422778 4744 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.422818 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 19:53:25.314955709 +0000 UTC Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.422934 4744 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1744h58m51.892028876s for next certificate rotation Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.423471 4744 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.423490 4744 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.423681 4744 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.423799 4744 server.go:460] "Adding debug handlers to kubelet server" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.428868 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.430449 4744 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="200ms" Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.430413 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.442081 4744 factory.go:55] Registering systemd factory Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.442571 4744 factory.go:221] Registration of the systemd container factory successfully Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.430320 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869efdfacb4acd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 02:54:33.419156695 +0000 UTC m=+0.592376689,LastTimestamp:2025-09-30 02:54:33.419156695 +0000 UTC m=+0.592376689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.443905 4744 factory.go:153] Registering CRI-O factory Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.443969 4744 factory.go:221] Registration of the crio container factory successfully Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.444074 4744 factory.go:219] Registration of the 
containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.444099 4744 factory.go:103] Registering Raw factory Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.444119 4744 manager.go:1196] Started watching for new ooms in manager Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.445716 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.450740 4744 manager.go:319] Starting recovery of all containers Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453670 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453746 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453764 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453779 
4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453793 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453807 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453823 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453838 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453859 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453873 4744 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453887 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453901 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453918 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453940 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453967 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453982 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.453995 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454008 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454025 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454067 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454138 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454158 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454178 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454197 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454215 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454267 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454293 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454313 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" 
seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454333 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454389 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454412 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454430 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454445 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454461 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454478 4744 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454504 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454519 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454534 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454550 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454565 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454581 4744 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454596 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454611 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454644 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454660 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454675 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454690 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454706 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454721 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454738 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454752 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454768 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454789 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454806 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454823 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454839 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454854 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454870 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454885 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454901 4744 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454918 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454932 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454949 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454964 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454978 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.454994 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455009 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455026 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455041 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455055 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455069 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455084 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455098 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455114 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455127 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455144 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455159 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455172 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455186 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455203 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455217 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455233 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455250 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455265 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455280 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455294 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455309 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455324 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455339 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455354 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455391 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455412 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455429 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455444 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455461 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455476 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455492 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455507 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455521 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455538 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455553 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455570 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455590 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455609 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455639 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455661 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455680 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455697 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455713 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455728 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455763 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455779 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455796 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455811 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455826 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455839 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455853 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455869 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455882 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455896 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455911 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455926 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455939 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455956 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455970 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455983 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.455999 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456016 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456034 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456048 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456063 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456077 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456091 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456106 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456119 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456133 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456147 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456163 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456176 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456191 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456205 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456219 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456235 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456248 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456262 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456275 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456291 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456308 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456322 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456337 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456351 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456389 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456414 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456433 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456452 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456468 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456483 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456497 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456513 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456528 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456547 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456561 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456574 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456591 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456607 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456622 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456636 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456651 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456665 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456683 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456697 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456712 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456725 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456739 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456755 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456769 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456783 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456798 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456813 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456884 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456908 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456926 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456944 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.456990 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460409 4744 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460452 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460473 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460496 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460518 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460538 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460561 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460580 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460602 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460623 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460644 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460665 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460686 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460707 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460727 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460748 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460772 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460792 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460812 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460834 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460854 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460875 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460897 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460917 4744 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460936 4744 reconstruct.go:97] "Volume reconstruction finished" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.460949 4744 reconciler.go:26] "Reconciler: start to sync state" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.477860 4744 manager.go:324] Recovery completed Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.490804 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.492698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.492736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.492749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.494027 4744 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.494041 4744 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.494061 4744 state_mem.go:36] "Initialized new in-memory state store" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.496809 4744 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.499976 4744 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.501526 4744 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.501691 4744 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.502267 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.502349 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.502443 4744 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.511134 4744 policy_none.go:49] "None policy: Start" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.512306 4744 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.512413 4744 state_mem.go:35] "Initializing new in-memory state store" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.529143 4744 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.566674 4744 manager.go:334] 
"Starting Device Plugin manager" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.566961 4744 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.566995 4744 server.go:79] "Starting device plugin registration server" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.567812 4744 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.567842 4744 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.569073 4744 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.569225 4744 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.569246 4744 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.585933 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.603185 4744 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.603336 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.605261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.605315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.605330 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.605547 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.605824 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.605893 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.606973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.606997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607147 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607257 4744 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607532 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.607564 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608082 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608129 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608282 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608424 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608459 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.608952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.609613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.609653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.609666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.609780 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.610086 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.610113 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.610471 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.610488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.610496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611158 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611185 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.611522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.612115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.612173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.612191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.631164 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="400ms" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.663284 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.663458 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.663564 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.663708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.663806 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.663906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664013 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664197 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664414 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664538 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664627 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 
02:54:33.664701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664815 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.664854 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.668604 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.670186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.670262 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 
02:54:33.670291 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.670347 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.671295 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767134 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767339 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 
02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767475 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767539 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767587 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767671 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767703 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767615 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767777 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767676 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767814 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767836 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767918 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767927 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767972 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.767985 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768075 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768111 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768255 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc 
kubenswrapper[4744]: I0930 02:54:33.768278 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768283 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768145 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.768316 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.872192 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.873782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.873855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.873876 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.873920 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 02:54:33 crc kubenswrapper[4744]: E0930 02:54:33.874465 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.931611 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.955669 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.979127 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6ce97f8fc0942eef2c7dec260edf7350271241ea8b2d4a57f5e3e2d7e69148f6 WatchSource:0}: Error finding container 6ce97f8fc0942eef2c7dec260edf7350271241ea8b2d4a57f5e3e2d7e69148f6: Status 404 returned error can't find the container with id 6ce97f8fc0942eef2c7dec260edf7350271241ea8b2d4a57f5e3e2d7e69148f6 Sep 30 02:54:33 crc kubenswrapper[4744]: I0930 02:54:33.984262 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:33 crc kubenswrapper[4744]: W0930 02:54:33.993755 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d02fba11d7b831e07ec9dc0844535f0f71c39b42ea99a790178518b803faa0bf WatchSource:0}: Error finding container d02fba11d7b831e07ec9dc0844535f0f71c39b42ea99a790178518b803faa0bf: Status 404 returned error can't find the container with id d02fba11d7b831e07ec9dc0844535f0f71c39b42ea99a790178518b803faa0bf Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.009395 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.011291 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8bf5c6ec7dec8f0ded412066a5df95b469a7bbfe62d66b4ca7148cf746df0045 WatchSource:0}: Error finding container 8bf5c6ec7dec8f0ded412066a5df95b469a7bbfe62d66b4ca7148cf746df0045: Status 404 returned error can't find the container with id 8bf5c6ec7dec8f0ded412066a5df95b469a7bbfe62d66b4ca7148cf746df0045 Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.015123 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.032773 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="800ms" Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.038736 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2649c9339d5bac6fd0632c44b00b427eaeffb0cd2ae95f02b5dd816c202a03de WatchSource:0}: Error finding container 2649c9339d5bac6fd0632c44b00b427eaeffb0cd2ae95f02b5dd816c202a03de: Status 404 returned error can't find the container with id 2649c9339d5bac6fd0632c44b00b427eaeffb0cd2ae95f02b5dd816c202a03de Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.044403 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-06c34a4a379bd2fc13dfea666669d680c576304cc62e716fe1ba3b34ecfe751c WatchSource:0}: Error finding container 06c34a4a379bd2fc13dfea666669d680c576304cc62e716fe1ba3b34ecfe751c: Status 404 returned error can't find the container with id 06c34a4a379bd2fc13dfea666669d680c576304cc62e716fe1ba3b34ecfe751c Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.275256 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.277325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.277429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 
30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.277448 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.277488 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.278118 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.422171 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.510198 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"06c34a4a379bd2fc13dfea666669d680c576304cc62e716fe1ba3b34ecfe751c"} Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.512096 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2649c9339d5bac6fd0632c44b00b427eaeffb0cd2ae95f02b5dd816c202a03de"} Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.513167 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8bf5c6ec7dec8f0ded412066a5df95b469a7bbfe62d66b4ca7148cf746df0045"} Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.515498 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d02fba11d7b831e07ec9dc0844535f0f71c39b42ea99a790178518b803faa0bf"} Sep 30 02:54:34 crc kubenswrapper[4744]: I0930 02:54:34.517046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6ce97f8fc0942eef2c7dec260edf7350271241ea8b2d4a57f5e3e2d7e69148f6"} Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.731509 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.732057 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.753888 4744 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869efdfacb4acd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 02:54:33.419156695 +0000 UTC m=+0.592376689,LastTimestamp:2025-09-30 02:54:33.419156695 +0000 UTC m=+0.592376689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.833465 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="1.6s" Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.838303 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.838421 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.930745 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.930922 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:34 crc kubenswrapper[4744]: W0930 02:54:34.951048 4744 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Sep 30 02:54:34 crc kubenswrapper[4744]: E0930 02:54:34.951135 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.079158 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.081989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.082047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.082057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.082089 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 02:54:35 crc kubenswrapper[4744]: E0930 02:54:35.082796 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.421726 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: 
connect: connection refused Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.525498 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.525500 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.525575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.525597 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.525616 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.526529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.526587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.526606 4744 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.529522 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c" exitCode=0 Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.529766 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.530334 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.531347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.531408 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.531418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.532408 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22" exitCode=0 Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.532487 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.532583 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.533582 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.533604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.533613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.535156 4744 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76" exitCode=0 Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.535201 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.535270 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.536180 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.536198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.536207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.538329 4744 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3" exitCode=0 Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.538407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3"} Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.538573 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.540034 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.540783 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.540819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.540836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.541189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.541260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.541282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:35 crc kubenswrapper[4744]: I0930 02:54:35.689890 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:36 crc 
kubenswrapper[4744]: I0930 02:54:36.422492 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused
Sep 30 02:54:36 crc kubenswrapper[4744]: E0930 02:54:36.434548 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="3.2s"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.548078 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.548158 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.548177 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.548403 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.549968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.550015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.550033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.552752 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.552810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.552825 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.552837 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.554461 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913" exitCode=0
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.554513 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.555389 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.556390 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.556413 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.556422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:36 crc kubenswrapper[4744]: W0930 02:54:36.560304 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused
Sep 30 02:54:36 crc kubenswrapper[4744]: E0930 02:54:36.560441 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.560559 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.560579 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b9e9fce98fe817bbde50eb98e6f41741d185892ce2526f72a172ff9fd85e7d4a"}
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.560565 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.562311 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.562410 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.562426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.562623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.562672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.562688 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.682917 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.684410 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.684444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.684457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:36 crc kubenswrapper[4744]: I0930 02:54:36.684481 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 02:54:36 crc kubenswrapper[4744]: E0930 02:54:36.684878 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.566661 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.566668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e"}
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.570014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.570144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.570258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573017 4744 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37" exitCode=0
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573161 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573478 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37"}
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573619 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573702 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573623 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.573811 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574412 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.574979 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.575003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.575016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.575593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.575629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.575643 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:37 crc kubenswrapper[4744]: I0930 02:54:37.876555 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.311860 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.580853 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab"}
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.580904 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.580920 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08"}
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.580944 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c"}
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.581013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da"}
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.580990 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.580955 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.581024 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582711 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582759 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582774 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582763 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:38 crc kubenswrapper[4744]: I0930 02:54:38.582934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.589324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4"}
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.589360 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.589436 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.589484 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.591040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.591073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.591085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.591137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.591168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.591185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.885918 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.887092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.887131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.887153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:39 crc kubenswrapper[4744]: I0930 02:54:39.887190 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 02:54:40 crc kubenswrapper[4744]: I0930 02:54:40.591477 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:40 crc kubenswrapper[4744]: I0930 02:54:40.592516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:40 crc kubenswrapper[4744]: I0930 02:54:40.592543 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:40 crc kubenswrapper[4744]: I0930 02:54:40.592552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.198765 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.199063 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.201308 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.201395 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.201419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.211529 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.595257 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.596573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.596617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.596630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.914215 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.914389 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.914423 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.915846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.915923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:41 crc kubenswrapper[4744]: I0930 02:54:41.915952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:42 crc kubenswrapper[4744]: I0930 02:54:42.230790 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Sep 30 02:54:42 crc kubenswrapper[4744]: I0930 02:54:42.231042 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:42 crc kubenswrapper[4744]: I0930 02:54:42.232555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:42 crc kubenswrapper[4744]: I0930 02:54:42.232697 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:42 crc kubenswrapper[4744]: I0930 02:54:42.232796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.127183 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.127472 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.128790 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.128844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.128861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:43 crc kubenswrapper[4744]: E0930 02:54:43.586077 4744 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.769141 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.769430 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.771185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.771249 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.771267 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.889157 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.889437 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.891081 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.891237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:43 crc kubenswrapper[4744]: I0930 02:54:43.891303 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:46 crc kubenswrapper[4744]: I0930 02:54:46.128280 4744 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 02:54:46 crc kubenswrapper[4744]: I0930 02:54:46.128518 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 02:54:47 crc kubenswrapper[4744]: W0930 02:54:47.363941 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.364113 4744 trace.go:236] Trace[1089477436]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 02:54:37.361) (total time: 10002ms):
Sep 30 02:54:47 crc kubenswrapper[4744]: Trace[1089477436]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (02:54:47.363)
Sep 30 02:54:47 crc kubenswrapper[4744]: Trace[1089477436]: [10.002185839s] [10.002185839s] END
Sep 30 02:54:47 crc kubenswrapper[4744]: E0930 02:54:47.364158 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.422506 4744 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Sep 30 02:54:47 crc kubenswrapper[4744]: W0930 02:54:47.506699 4744 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.506787 4744 trace.go:236] Trace[1076634371]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 02:54:37.505) (total time: 10001ms):
Sep 30 02:54:47 crc kubenswrapper[4744]: Trace[1076634371]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (02:54:47.506)
Sep 30 02:54:47 crc kubenswrapper[4744]: Trace[1076634371]: [10.001735574s] [10.001735574s] END
Sep 30 02:54:47 crc kubenswrapper[4744]: E0930 02:54:47.506813 4744 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.637419 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.637474 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.643529 4744 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.643628 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.881829 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.882050 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.883684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.883730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:54:47 crc kubenswrapper[4744]: I0930 02:54:47.883746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:54:51 crc kubenswrapper[4744]: I0930 02:54:51.601747 4744 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Sep 30 02:54:51 crc kubenswrapper[4744]: I0930 02:54:51.801751 4744 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Sep 30 02:54:51 crc kubenswrapper[4744]: I0930 02:54:51.924677 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 02:54:51 crc kubenswrapper[4744]: I0930 02:54:51.932663 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.270262 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.285586 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.417090 4744 apiserver.go:52] "Watching apiserver"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.422108 4744 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.422803 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.423574 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.423655 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.423766 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.424082 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.424287 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.424505 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.425402 4744 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.425543 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.425655 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.425676 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.427645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.427725 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.427811 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.427808 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.428539 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.428724 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.429215 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.430103 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.437435 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.464273 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.480730 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.505618 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.518523 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.538180 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.553137 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"
2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.570773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.588047 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.598614 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.609819 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.640501 4744 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.642035 4744 trace.go:236] Trace[1058593511]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 02:54:37.938) (total time: 14703ms): Sep 30 02:54:52 crc kubenswrapper[4744]: Trace[1058593511]: ---"Objects listed" error: 14703ms (02:54:52.641) Sep 30 02:54:52 crc kubenswrapper[4744]: Trace[1058593511]: [14.703846179s] [14.703846179s] END Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.642063 4744 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.644034 4744 reconstruct.go:205] 
"DevicePaths of reconstructed volumes updated" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.644932 4744 trace.go:236] Trace[1964013755]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 02:54:40.190) (total time: 12453ms): Sep 30 02:54:52 crc kubenswrapper[4744]: Trace[1964013755]: ---"Objects listed" error: 12453ms (02:54:52.643) Sep 30 02:54:52 crc kubenswrapper[4744]: Trace[1964013755]: [12.453313985s] [12.453313985s] END Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.644971 4744 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.645127 4744 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.649491 4744 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.652262 4744 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.681806 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.694501 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.717085 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.729828 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.744774 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.744823 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.744858 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 
02:54:52.744904 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.744926 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.744950 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745038 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745079 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745138 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745168 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745197 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745227 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745258 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745287 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745356 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745453 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745476 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745496 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745518 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745540 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745616 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745637 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 02:54:52 
crc kubenswrapper[4744]: I0930 02:54:52.745659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745700 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745719 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745739 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745760 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745782 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745827 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745854 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745892 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746921 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745786 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.745807 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.745911 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:54:53.245888127 +0000 UTC m=+20.419108101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748086 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748124 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748145 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748165 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748182 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748199 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748215 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748266 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748283 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748299 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748315 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748347 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748362 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748397 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748412 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748427 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748445 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748463 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748480 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748496 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748511 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748540 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748558 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748574 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748592 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748616 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748649 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748666 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748681 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748697 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748713 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748753 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748772 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.748816 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749185 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749201 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749216 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746021 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746039 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746268 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746470 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746484 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746616 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746677 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746787 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.746939 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.747123 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.747077 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749233 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749592 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749632 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749663 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749697 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749725 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749753 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749783 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749809 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749837 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749864 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749893 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749944 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749974 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.749997 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750017 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750038 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750064 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750090 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750111 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750134 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750160 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750188 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750216 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750245 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750271 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750298 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750323 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750347 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750395 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 02:54:52 crc 
kubenswrapper[4744]: I0930 02:54:52.750420 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750474 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750501 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750525 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750560 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750665 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750688 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750714 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750738 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750759 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750780 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750800 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750821 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750846 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750867 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750889 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750918 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750942 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750975 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.750998 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751046 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751069 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751092 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751124 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751146 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751170 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751196 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751218 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751242 
4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751265 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751287 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751310 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751384 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751410 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751432 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751457 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751478 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751500 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.751549 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.752484 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.752555 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.752604 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.752910 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.753627 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.753708 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.753852 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.754009 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.754239 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.754253 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.754503 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.754565 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.754994 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.755073 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.755942 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.756055 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.756462 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.756534 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.756703 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.757103 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.757321 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.758599 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.758711 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.758746 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.759255 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.759597 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.760306 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.760399 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.760408 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.760650 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.761111 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.761131 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.761464 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.761673 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.761909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762045 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762067 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762236 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762534 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762748 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762739 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762830 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.762976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763016 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763226 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763473 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763488 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763678 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763871 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.763894 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.764117 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.764175 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.764668 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.764963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.767498 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.767823 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.767928 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.767990 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.768127 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.768386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.768526 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.768585 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.768982 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.769573 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.769605 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770224 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770634 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770661 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770693 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770831 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770867 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.770920 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771361 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771392 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771357 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771566 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771638 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771682 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771743 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771756 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771832 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771869 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771907 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771971 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772001 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.771999 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772031 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772057 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772083 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772112 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772240 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772303 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772312 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772362 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772396 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772859 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772866 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.772982 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.773125 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.773160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.773550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.773818 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.774191 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.774197 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.768835 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.774239 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.774466 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.774304 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.775067 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.775609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.775753 
4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.775840 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.776474 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.776331 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.776967 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.777460 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.776712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.777791 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.784117 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.789950 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.789990 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.790193 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.790455 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.790735 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.790741 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.790807 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.790855 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.793274 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.793114 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.793326 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.793547 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.793872 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.793911 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.794017 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.794052 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.794117 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.794461 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.794818 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.794947 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795171 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795267 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795309 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795318 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795341 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795392 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795419 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795443 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795469 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795493 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795555 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795586 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795610 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795637 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795661 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795685 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795709 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795758 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795784 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795812 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795838 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795861 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795919 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796009 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796076 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796112 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 
02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796141 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796228 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796253 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796276 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796300 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796450 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796469 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796484 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796498 4744 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: 
I0930 02:54:52.796511 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796524 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796538 4744 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796551 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796564 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796577 4744 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796590 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796603 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796616 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796631 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796643 4744 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796656 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796670 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796685 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796700 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796716 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796757 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796772 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796786 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796799 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796814 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796826 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") 
on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796839 4744 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796857 4744 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796870 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796883 4744 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796898 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796914 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796927 4744 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc 
kubenswrapper[4744]: I0930 02:54:52.796941 4744 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796955 4744 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796967 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796979 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796994 4744 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797006 4744 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797019 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797033 4744 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797046 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797058 4744 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797070 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797084 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797095 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797107 4744 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797119 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 
crc kubenswrapper[4744]: I0930 02:54:52.797132 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797146 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797160 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797173 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797185 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797196 4744 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797209 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797223 4744 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797235 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797249 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797263 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797275 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797288 4744 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797302 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797316 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node 
\"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797328 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797339 4744 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797354 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797398 4744 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797413 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797426 4744 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797439 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc 
kubenswrapper[4744]: I0930 02:54:52.797943 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.795738 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796054 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796263 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796826 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.796880 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.797013 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.798526 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 02:54:53.298492977 +0000 UTC m=+20.471713151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.798563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.799829 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.799974 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.800287 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.800445 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.800485 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.800539 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:53.30052145 +0000 UTC m=+20.473741424 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.800633 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.800638 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.801092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.804115 4744 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.808757 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.808995 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.809255 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797257 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797279 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797338 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797364 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797875 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797910 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797916 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.797980 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.798201 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.798290 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.809821 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.809883 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811506 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811749 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811767 4744 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811781 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811794 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811806 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811819 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811831 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811844 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811860 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811873 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811886 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811912 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811925 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811938 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811951 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811960 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 
30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811969 4744 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811978 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811987 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.811999 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812010 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812020 4744 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812033 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc 
kubenswrapper[4744]: I0930 02:54:52.812044 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812054 4744 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812063 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812072 4744 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812081 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812090 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812100 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812110 4744 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812122 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812131 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812142 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812151 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812159 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812168 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812179 4744 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812190 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812202 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812213 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812225 4744 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812236 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812247 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812256 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812265 4744 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812276 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812284 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812293 4744 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812302 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812310 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812322 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" 
DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812335 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812348 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812358 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812387 4744 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812402 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812411 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812421 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812433 4744 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812442 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812451 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812459 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812467 4744 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812478 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812487 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.812500 4744 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.815658 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.809148 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.816977 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.817318 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.817332 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.821731 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.822884 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.827994 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.830801 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.831097 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.831963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.834624 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.834841 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.835053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.837771 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.838714 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.842186 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842538 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842563 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842578 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.842575 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
(OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842640 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:53.342622415 +0000 UTC m=+20.515842379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842665 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842679 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842691 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:52 crc kubenswrapper[4744]: E0930 02:54:52.842745 4744 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:53.342724218 +0000 UTC m=+20.515944272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.843176 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.850462 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.851943 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.852269 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.852618 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.854000 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.855414 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.857044 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.857308 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.859732 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.863534 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.864009 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.864485 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.879732 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.889065 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.891278 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.893399 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.893856 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913554 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913643 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913655 4744 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913664 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913674 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913682 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913691 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913700 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node 
\"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913709 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913718 4744 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913727 4744 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913737 4744 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913745 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913754 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913762 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913771 4744 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913782 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913791 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913800 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913809 4744 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913818 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913826 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913834 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" 
(UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913842 4744 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913851 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913861 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913870 4744 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913878 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913887 4744 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913895 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913903 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913912 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913920 4744 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913928 4744 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913936 4744 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913946 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913955 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913964 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913972 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913980 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.913991 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914001 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914009 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914017 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914024 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914032 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914042 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914050 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914058 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914068 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914077 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914084 4744 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914092 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914100 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914330 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:52 crc kubenswrapper[4744]: I0930 02:54:52.914833 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.051133 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 02:54:53 crc kubenswrapper[4744]: W0930 02:54:53.064168 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-cc5b6582c82d959ca569b7950e8a1243ff49bbb44a8274ede6fcc8fb30b2d2d5 WatchSource:0}: Error finding container cc5b6582c82d959ca569b7950e8a1243ff49bbb44a8274ede6fcc8fb30b2d2d5: Status 404 returned error can't find the container with id cc5b6582c82d959ca569b7950e8a1243ff49bbb44a8274ede6fcc8fb30b2d2d5 Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.065839 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.075049 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 02:54:53 crc kubenswrapper[4744]: W0930 02:54:53.079114 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9272bdb78b7d1e9db2ae80fc2e06e607c6ef55be7c7f215cf346d547902dde23 WatchSource:0}: Error finding container 9272bdb78b7d1e9db2ae80fc2e06e607c6ef55be7c7f215cf346d547902dde23: Status 404 returned error can't find the container with id 9272bdb78b7d1e9db2ae80fc2e06e607c6ef55be7c7f215cf346d547902dde23 Sep 30 02:54:53 crc kubenswrapper[4744]: W0930 02:54:53.096574 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0f2de4ba9e0fb8ca915d7beb4f18ca460863a264f9700234bef7b3cd76dcbc87 WatchSource:0}: Error finding container 0f2de4ba9e0fb8ca915d7beb4f18ca460863a264f9700234bef7b3cd76dcbc87: Status 404 
returned error can't find the container with id 0f2de4ba9e0fb8ca915d7beb4f18ca460863a264f9700234bef7b3cd76dcbc87 Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.161648 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.174858 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.176972 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.185756 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.206263 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.211484 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.231128 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.247690 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.265511 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.276790 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.291693 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.312549 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.320677 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.320797 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.320852 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.320924 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:54:54.320882827 +0000 UTC m=+21.494102811 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.320994 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.321015 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.321111 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:54.321073213 +0000 UTC m=+21.494304367 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.322400 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:54.322018612 +0000 UTC m=+21.495238606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.329288 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.339976 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.356831 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.366417 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.378860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.393346 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cf
b943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.404198 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.414667 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.422081 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.422148 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422304 4744 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422320 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422341 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422346 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422359 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422381 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422453 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:54.422437504 +0000 UTC m=+21.595657468 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:53 crc kubenswrapper[4744]: E0930 02:54:53.422477 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:54.422468895 +0000 UTC m=+21.595688869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.508821 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.509873 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.511544 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 
02:54:53.512468 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.514154 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.515092 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.516025 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.516426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.517566 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.518536 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.520246 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.521097 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.523588 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.524302 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.525071 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.526767 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.529284 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.531038 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.531644 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.532274 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.533003 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.533638 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.534330 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.535970 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.536764 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.537755 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.538475 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.539699 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.540240 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.541283 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.541568 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.542157 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.542716 4744 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.543322 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.545642 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.546480 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.547685 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.550301 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.551231 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.552474 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.553297 4744 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.554764 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.555679 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.556433 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.558105 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.559683 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.560312 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.561170 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.561797 4744 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.562459 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.563344 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.564558 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.565184 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.566407 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.567838 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.569417 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.570111 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.590220 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.600256 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.611602 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.622067 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.635597 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.642020 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb"} Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.642076 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c"} Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.642086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9272bdb78b7d1e9db2ae80fc2e06e607c6ef55be7c7f215cf346d547902dde23"} Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.643623 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19"} Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.643670 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cc5b6582c82d959ca569b7950e8a1243ff49bbb44a8274ede6fcc8fb30b2d2d5"} Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.644783 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0f2de4ba9e0fb8ca915d7beb4f18ca460863a264f9700234bef7b3cd76dcbc87"} Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.649007 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.667035 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.680351 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-0
9-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.690947 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.700264 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.712248 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.720625 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cf
b943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.731275 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.742289 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:53 crc kubenswrapper[4744]: I0930 02:54:53.755494 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.330624 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.330745 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.330790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.330866 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.330922 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:56.330903729 +0000 UTC m=+23.504123723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.331312 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 02:54:56.331299191 +0000 UTC m=+23.504519185 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.331543 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.331728 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:56.331708313 +0000 UTC m=+23.504928317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.431933 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.431980 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432089 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432103 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432113 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432142 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432184 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432198 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432163 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:56.432146416 +0000 UTC m=+23.605366390 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.432264 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:54:56.432245629 +0000 UTC m=+23.605465603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.503184 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.503294 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:54 crc kubenswrapper[4744]: I0930 02:54:54.503297 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.504005 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.504142 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:54:54 crc kubenswrapper[4744]: E0930 02:54:54.504220 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.350644 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.351088 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:55:00.351021144 +0000 UTC m=+27.524241178 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.351546 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.351694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.351842 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.351936 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:00.351919252 +0000 UTC m=+27.525139266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.351859 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.352052 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:00.352025326 +0000 UTC m=+27.525245300 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.453143 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.453236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453411 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453420 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453501 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453534 4744 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453452 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453650 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:00.453614704 +0000 UTC m=+27.626834718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453654 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.453728 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-09-30 02:55:00.453714367 +0000 UTC m=+27.626934391 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.503016 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.503055 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.503016 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.503143 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.503218 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:54:56 crc kubenswrapper[4744]: E0930 02:54:56.503280 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.653954 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8"} Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.672740 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.686262 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.699257 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.714331 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.726405 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.760412 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.785771 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.800131 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:56 crc kubenswrapper[4744]: I0930 02:54:56.815598 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:56Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.240845 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-c8f22"] Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.241106 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.248196 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.248234 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.248490 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.265317 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/739887b5-daec-4bee-944e-d718b3baebc5-hosts-file\") pod \"node-resolver-c8f22\" (UID: \"739887b5-daec-4bee-944e-d718b3baebc5\") " pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.265361 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97wsz\" (UniqueName: \"kubernetes.io/projected/739887b5-daec-4bee-944e-d718b3baebc5-kube-api-access-97wsz\") pod \"node-resolver-c8f22\" (UID: \"739887b5-daec-4bee-944e-d718b3baebc5\") " pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.290465 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.303949 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.320909 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.340185 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.365534 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.366020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97wsz\" (UniqueName: \"kubernetes.io/projected/739887b5-daec-4bee-944e-d718b3baebc5-kube-api-access-97wsz\") pod \"node-resolver-c8f22\" (UID: \"739887b5-daec-4bee-944e-d718b3baebc5\") " pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.366096 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/739887b5-daec-4bee-944e-d718b3baebc5-hosts-file\") pod \"node-resolver-c8f22\" (UID: \"739887b5-daec-4bee-944e-d718b3baebc5\") " pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.366164 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/739887b5-daec-4bee-944e-d718b3baebc5-hosts-file\") pod \"node-resolver-c8f22\" (UID: \"739887b5-daec-4bee-944e-d718b3baebc5\") " pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.377681 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.389550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97wsz\" (UniqueName: \"kubernetes.io/projected/739887b5-daec-4bee-944e-d718b3baebc5-kube-api-access-97wsz\") pod \"node-resolver-c8f22\" (UID: \"739887b5-daec-4bee-944e-d718b3baebc5\") " pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.392935 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.406536 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.421688 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.437056 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:58Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.502730 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.502846 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:54:58 crc kubenswrapper[4744]: E0930 02:54:58.503089 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.502964 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:54:58 crc kubenswrapper[4744]: E0930 02:54:58.503431 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:54:58 crc kubenswrapper[4744]: E0930 02:54:58.503424 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.554780 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c8f22" Sep 30 02:54:58 crc kubenswrapper[4744]: I0930 02:54:58.659202 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c8f22" event={"ID":"739887b5-daec-4bee-944e-d718b3baebc5","Type":"ContainerStarted","Data":"62bae3531044ac93ae056ee9797ffa154276696d7d1bb3cd6b7654e8848fae60"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.036908 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kp8zv"] Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.037240 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.038808 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.039068 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.039435 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.039801 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.039824 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.041642 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c5kw2"] Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.042298 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-v9lx7"] Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.042487 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.043104 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.044474 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nxppc"] Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.044709 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.045150 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.045247 4744 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.046640 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.046772 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.046941 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047206 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047268 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047404 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047501 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047626 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047668 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 02:54:59 crc kubenswrapper[4744]: W0930 02:54:59.047773 4744 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.047825 4744 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047853 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047882 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 02:54:59 crc 
kubenswrapper[4744]: I0930 02:54:59.047886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.048151 4744 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.047967 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.058266 4744 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.058334 4744 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.061694 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.061758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.061794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.061821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.061846 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.063652 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071657 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-socket-dir-parent\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071699 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-netns\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071726 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071747 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-system-cni-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071766 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-conf-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071786 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-cni-bin\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 
02:54:59.071911 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6nvt\" (UniqueName: \"kubernetes.io/projected/6561e3c6-a8d1-4dc8-8bd3-09f042393658-kube-api-access-x6nvt\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cnibin\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.071994 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-bin\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmplx\" (UniqueName: \"kubernetes.io/projected/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-kube-api-access-cmplx\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072032 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-k8s-cni-cncf-io\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 
crc kubenswrapper[4744]: I0930 02:54:59.072050 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-multus-certs\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072100 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-kubelet\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-var-lib-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072161 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-os-release\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072194 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-etc-kubernetes\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" 
Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072311 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-systemd\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovn-node-metrics-cert\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072359 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-system-cni-dir\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072461 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-systemd-units\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c5kw2\" 
(UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-os-release\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072546 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6561e3c6-a8d1-4dc8-8bd3-09f042393658-cni-binary-copy\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072583 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-script-lib\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072618 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f425n\" (UniqueName: \"kubernetes.io/projected/f7011cf3-078f-4c08-bef7-f89fe27e51f5-kube-api-access-f425n\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072646 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-cni-dir\") pod 
\"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072669 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072690 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-config\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072709 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-env-overrides\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072726 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-hostroot\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072747 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-slash\") pod \"ovnkube-node-c5kw2\" (UID: 
\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072767 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-log-socket\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072789 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-kubelet\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072812 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-ovn\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072831 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-daemon-config\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072879 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-etc-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: 
\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.072926 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-ovn-kubernetes\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073057 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-netd\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073131 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-node-log\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073155 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-cnibin\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073182 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-cni-multus\") pod \"multus-nxppc\" (UID: 
\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073234 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073265 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.073293 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-netns\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.084483 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.088804 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.092337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.092517 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.092620 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.092711 
4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.092800 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.097029 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.104968 4744 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.108279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.108415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.108516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.108659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.108765 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.110035 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.124324 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.124658 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.131031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.131054 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.131063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.131076 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.131086 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.144428 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.148637 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.152224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.152246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.152254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.152267 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.152276 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.167071 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: E0930 02:54:59.167232 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.168598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.168616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.168624 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.168638 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.168647 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.170626 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\
\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173826 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-systemd\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173851 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovn-node-metrics-cert\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173865 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-system-cni-dir\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173878 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-etc-kubernetes\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173895 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-systemd-units\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173909 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173923 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-os-release\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173938 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6561e3c6-a8d1-4dc8-8bd3-09f042393658-cni-binary-copy\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173952 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-script-lib\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173966 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f425n\" (UniqueName: \"kubernetes.io/projected/f7011cf3-078f-4c08-bef7-f89fe27e51f5-kube-api-access-f425n\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-cni-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.173992 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-env-overrides\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174006 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-hostroot\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-openvswitch\") pod 
\"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174034 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-config\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174050 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-kubelet\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a0ffb258-115f-4a60-92da-91d4a9036c10-rootfs\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174086 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0ffb258-115f-4a60-92da-91d4a9036c10-proxy-tls\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174102 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-slash\") pod \"ovnkube-node-c5kw2\" (UID: 
\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174118 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-log-socket\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174134 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-ovn\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-daemon-config\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174181 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-etc-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174219 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-netd\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 
crc kubenswrapper[4744]: I0930 02:54:59.174235 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-ovn-kubernetes\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-cnibin\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174263 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-cni-multus\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174278 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0ffb258-115f-4a60-92da-91d4a9036c10-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174293 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-node-log\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174308 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174321 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174335 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-netns\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174350 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbn2p\" (UniqueName: \"kubernetes.io/projected/a0ffb258-115f-4a60-92da-91d4a9036c10-kube-api-access-nbn2p\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174386 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-netns\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 
02:54:59.174402 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174415 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-system-cni-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-socket-dir-parent\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174443 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-conf-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174458 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-cni-bin\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-x6nvt\" (UniqueName: \"kubernetes.io/projected/6561e3c6-a8d1-4dc8-8bd3-09f042393658-kube-api-access-x6nvt\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cnibin\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmplx\" (UniqueName: \"kubernetes.io/projected/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-kube-api-access-cmplx\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174513 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-k8s-cni-cncf-io\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174525 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-multus-certs\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-bin\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174559 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-kubelet\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174573 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-var-lib-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-os-release\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174664 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-os-release\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.174693 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-systemd\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175207 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-system-cni-dir\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175250 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-ovn-kubernetes\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175359 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-k8s-cni-cncf-io\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175420 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-system-cni-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175512 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-systemd-units\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175510 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-conf-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175595 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-kubelet\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175610 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-os-release\") pod \"multus-nxppc\" (UID: 
\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175620 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-multus-certs\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175643 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-log-socket\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175648 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-bin\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-cni-bin\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175692 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-kubelet\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175700 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cnibin\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-slash\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175738 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-var-lib-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175784 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-socket-dir-parent\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175822 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-run-netns\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175822 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-hostroot\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-netns\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175837 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-ovn\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175850 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-etc-openvswitch\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175884 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-netd\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-cnibin\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " 
pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-host-var-lib-cni-multus\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176067 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-cni-dir\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176094 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-node-log\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-config\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176326 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7011cf3-078f-4c08-bef7-f89fe27e51f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.175389 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6561e3c6-a8d1-4dc8-8bd3-09f042393658-etc-kubernetes\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176673 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6561e3c6-a8d1-4dc8-8bd3-09f042393658-multus-daemon-config\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176707 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176747 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-env-overrides\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-script-lib\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/f7011cf3-078f-4c08-bef7-f89fe27e51f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.178871 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovn-node-metrics-cert\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.176670 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6561e3c6-a8d1-4dc8-8bd3-09f042393658-cni-binary-copy\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.193857 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6nvt\" (UniqueName: \"kubernetes.io/projected/6561e3c6-a8d1-4dc8-8bd3-09f042393658-kube-api-access-x6nvt\") pod \"multus-nxppc\" (UID: \"6561e3c6-a8d1-4dc8-8bd3-09f042393658\") " pod="openshift-multus/multus-nxppc" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.194629 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmplx\" (UniqueName: \"kubernetes.io/projected/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-kube-api-access-cmplx\") pod \"ovnkube-node-c5kw2\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.195998 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f425n\" (UniqueName: 
\"kubernetes.io/projected/f7011cf3-078f-4c08-bef7-f89fe27e51f5-kube-api-access-f425n\") pod \"multus-additional-cni-plugins-v9lx7\" (UID: \"f7011cf3-078f-4c08-bef7-f89fe27e51f5\") " pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.196346 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.213725 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.228493 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.271153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.271204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.271213 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.271231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.271241 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.275792 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbn2p\" (UniqueName: \"kubernetes.io/projected/a0ffb258-115f-4a60-92da-91d4a9036c10-kube-api-access-nbn2p\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.276058 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a0ffb258-115f-4a60-92da-91d4a9036c10-rootfs\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.276109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0ffb258-115f-4a60-92da-91d4a9036c10-proxy-tls\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.276177 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0ffb258-115f-4a60-92da-91d4a9036c10-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.276191 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a0ffb258-115f-4a60-92da-91d4a9036c10-rootfs\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.277017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0ffb258-115f-4a60-92da-91d4a9036c10-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.279733 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0ffb258-115f-4a60-92da-91d4a9036c10-proxy-tls\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.311788 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.324970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbn2p\" (UniqueName: \"kubernetes.io/projected/a0ffb258-115f-4a60-92da-91d4a9036c10-kube-api-access-nbn2p\") pod \"machine-config-daemon-kp8zv\" (UID: \"a0ffb258-115f-4a60-92da-91d4a9036c10\") " pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.351707 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.354128 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.363539 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.376312 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.380676 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.380737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.380747 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.380782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.380795 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.393899 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.416595 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.436812 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.451688 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.469574 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.483742 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.483947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.484073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.484152 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.484211 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.484245 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.497920 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.511890 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.536347 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.549720 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.563649 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.584469 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.586360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.586509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.586633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.586727 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.586820 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.601639 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.664889 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c8f22" event={"ID":"739887b5-daec-4bee-944e-d718b3baebc5","Type":"ContainerStarted","Data":"6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.669883 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" exitCode=0 Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.669941 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.669996 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"3ba22fc8f802c825101b75a2921185f90beaf2cc982907ead9558a1118b4e1e2"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 
02:54:59.673628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.673715 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.673741 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"eb95ac990d293d07f4bbea3f04222f2088bd0c58854ad5318e423ed609f3ca1b"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.675113 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerStarted","Data":"7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.675142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerStarted","Data":"12650796f986703e1a0e3c0e0a1c8e367337158d5df4a09ffb0877050a68ef7c"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.687789 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.689415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.689453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.689467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.689488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.689502 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.711939 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.724937 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.745175 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.761013 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.778696 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.791799 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.791838 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.791848 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc 
kubenswrapper[4744]: I0930 02:54:59.791861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.791871 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.803855 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.824488 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.845894 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454
565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.860031 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.877187 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.891595 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.893310 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.893452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.893463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.893481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.893492 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.915312 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.930215 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.943638 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.962860 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.982352 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.995642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.995668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.995676 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 
02:54:59.995690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.995700 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:54:59Z","lastTransitionTime":"2025-09-30T02:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:54:59 crc kubenswrapper[4744]: I0930 02:54:59.998486 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:54:59Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.014552 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.040381 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.061466 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.078543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.098347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.098409 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.098421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc 
kubenswrapper[4744]: I0930 02:55:00.098437 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.098449 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.108240 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.122401 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.136054 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a2019
35eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.149731 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.163508 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.180770 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.201029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.201095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.201114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.201137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.201153 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.304495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.304530 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.304539 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.304554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.304566 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.382277 4744 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-multus/multus-nxppc" secret="" err="failed to sync secret cache: timed out waiting for the condition" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.382350 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nxppc" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.387174 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.387330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.387387 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.387522 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.387585 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:08.387568021 +0000 UTC m=+35.560788015 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.387650 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:55:08.387639144 +0000 UTC m=+35.560859128 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.387721 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.387755 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:08.387747287 +0000 UTC m=+35.560967281 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.407346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.407421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.407438 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.407459 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.407475 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: W0930 02:55:00.409063 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6561e3c6_a8d1_4dc8_8bd3_09f042393658.slice/crio-5cc5684052f1dced44e3fe45d5d416a75ef0d286a6f06343ab2272f53f31bde8 WatchSource:0}: Error finding container 5cc5684052f1dced44e3fe45d5d416a75ef0d286a6f06343ab2272f53f31bde8: Status 404 returned error can't find the container with id 5cc5684052f1dced44e3fe45d5d416a75ef0d286a6f06343ab2272f53f31bde8 Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.488651 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.489265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.489541 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.489583 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.489606 4744 projected.go:194] 
Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.489693 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:08.489665265 +0000 UTC m=+35.662885279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.490362 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.490447 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.490471 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 
02:55:00.490529 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:08.490512182 +0000 UTC m=+35.663732186 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.507336 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.507510 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.507516 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.507707 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.507344 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:00 crc kubenswrapper[4744]: E0930 02:55:00.508160 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.509856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.509882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.509893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.509908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.509916 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.602659 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.612714 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.612786 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.612808 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.612838 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.612860 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.680965 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.681008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.681017 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.681027 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.681036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.681044 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} Sep 30 02:55:00 crc kubenswrapper[4744]: 
I0930 02:55:00.681719 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerStarted","Data":"5cc5684052f1dced44e3fe45d5d416a75ef0d286a6f06343ab2272f53f31bde8"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.683015 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7011cf3-078f-4c08-bef7-f89fe27e51f5" containerID="7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d" exitCode=0 Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.683144 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerDied","Data":"7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.698731 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.708954 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.715979 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.716075 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.716101 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.716129 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.716151 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.721183 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.733169 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.744722 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.762865 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.776430 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.804472 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454
565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.818739 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.823680 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.823717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.823728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.823743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.823754 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.831707 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.844948 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.858579 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.873783 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.887034 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:00Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.925320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.925809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.925819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:00 crc 
kubenswrapper[4744]: I0930 02:55:00.925835 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:00 crc kubenswrapper[4744]: I0930 02:55:00.925845 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:00Z","lastTransitionTime":"2025-09-30T02:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.028285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.028357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.028386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.028406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.028420 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.132222 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.132265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.132275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.132295 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.132307 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.235351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.235435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.235446 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.235472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.235487 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.338059 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.338102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.338114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.338137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.338152 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.440962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.441003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.441015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.441033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.441044 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.544060 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.544140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.544162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.544196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.544218 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.647456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.647511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.647523 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.647538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.647548 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.684700 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-twwm8"] Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.685217 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.688646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerStarted","Data":"d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.689648 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.689934 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.690114 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.693702 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.698031 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7011cf3-078f-4c08-bef7-f89fe27e51f5" containerID="6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c" exitCode=0 Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.698083 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerDied","Data":"6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.701318 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcpnn\" (UniqueName: \"kubernetes.io/projected/c27f07dd-93c3-4287-9a8c-c6c0e7724776-kube-api-access-zcpnn\") pod 
\"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.701541 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c27f07dd-93c3-4287-9a8c-c6c0e7724776-host\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.701586 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c27f07dd-93c3-4287-9a8c-c6c0e7724776-serviceca\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.711353 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.728551 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.750092 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.750328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.750432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.750452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.750480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.750499 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.771178 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.790054 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.802304 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c27f07dd-93c3-4287-9a8c-c6c0e7724776-host\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.802352 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c27f07dd-93c3-4287-9a8c-c6c0e7724776-serviceca\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.802416 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcpnn\" (UniqueName: \"kubernetes.io/projected/c27f07dd-93c3-4287-9a8c-c6c0e7724776-kube-api-access-zcpnn\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.802561 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c27f07dd-93c3-4287-9a8c-c6c0e7724776-host\") pod \"node-ca-twwm8\" (UID: 
\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.804127 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c27f07dd-93c3-4287-9a8c-c6c0e7724776-serviceca\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.806391 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\"
,\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.820886 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.825568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcpnn\" (UniqueName: \"kubernetes.io/projected/c27f07dd-93c3-4287-9a8c-c6c0e7724776-kube-api-access-zcpnn\") pod \"node-ca-twwm8\" (UID: \"c27f07dd-93c3-4287-9a8c-c6c0e7724776\") " pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.843409 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.852552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.852601 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.852610 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.852623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.852635 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.858277 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.874137 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e0
9a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.888960 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.903229 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.915329 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.928296 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d
5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.938633 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.954659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.954691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.954699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 
02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.954713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.954734 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:01Z","lastTransitionTime":"2025-09-30T02:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.958223 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.970693 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.984278 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:01 crc kubenswrapper[4744]: I0930 02:55:01.996214 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.006684 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-twwm8" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.009881 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.026067 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.046391 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: W0930 02:55:02.055153 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27f07dd_93c3_4287_9a8c_c6c0e7724776.slice/crio-3e54129bd9617582c3bb5f1edaa7ad97e514886741e5146b2f1c15d2dc9e156a WatchSource:0}: Error finding container 3e54129bd9617582c3bb5f1edaa7ad97e514886741e5146b2f1c15d2dc9e156a: Status 404 returned error can't find the container with id 
3e54129bd9617582c3bb5f1edaa7ad97e514886741e5146b2f1c15d2dc9e156a Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.057010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.057047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.057057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.057072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.057084 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.065112 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de
5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.085813 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.100185 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.110932 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.131947 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.147322 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.162061 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.162124 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.162140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.162166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.162181 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.165650 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.184986 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.263768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.263813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.263825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.263842 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.263855 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.366463 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.366494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.366504 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.366518 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.366526 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.470345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.470405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.470417 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.470432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.470444 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.502844 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.502929 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:02 crc kubenswrapper[4744]: E0930 02:55:02.502966 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:02 crc kubenswrapper[4744]: E0930 02:55:02.503063 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.502927 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:02 crc kubenswrapper[4744]: E0930 02:55:02.503438 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.572927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.572971 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.572980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.572996 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.573006 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.676676 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.676765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.676791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.676824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.676900 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.704088 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7011cf3-078f-4c08-bef7-f89fe27e51f5" containerID="d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684" exitCode=0 Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.704234 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerDied","Data":"d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.705733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-twwm8" event={"ID":"c27f07dd-93c3-4287-9a8c-c6c0e7724776","Type":"ContainerStarted","Data":"58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.705789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-twwm8" event={"ID":"c27f07dd-93c3-4287-9a8c-c6c0e7724776","Type":"ContainerStarted","Data":"3e54129bd9617582c3bb5f1edaa7ad97e514886741e5146b2f1c15d2dc9e156a"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.709975 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.727080 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.739096 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.754240 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.767397 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.780174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.780207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.780215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 
02:55:02.780231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.780242 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.782626 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.799220 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 
02:55:02.819918 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.838076 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd890
9e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.854747 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.875067 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.883859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.883909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.883918 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.883935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.883947 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.888860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z 
is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.905088 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.921864 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 
02:55:02.937280 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.950167 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.969106 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.988618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.988664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.988679 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:02 crc 
kubenswrapper[4744]: I0930 02:55:02.988702 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.988716 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:02Z","lastTransitionTime":"2025-09-30T02:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:02 crc kubenswrapper[4744]: I0930 02:55:02.997901 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:02Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.012316 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090
fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.036739 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0
fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\
\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.049630 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.062553 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.073204 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.088643 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.091498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.091528 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.091540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.091555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.091567 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.102133 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.112426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.124147 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.136299 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.147539 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.166249 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 
02:55:03.178987 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.193581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.193625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.193639 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.193658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.193670 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.296419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.296458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.296468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.296483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.296493 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.399468 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.399548 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.399567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.400298 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.400345 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.504357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.504415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.504427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.504449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.504462 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.515905 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.529519 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.540927 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.563850 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.584835 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.607556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.607603 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.607613 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.607628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.607638 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.612343 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7
099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.635166 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.653243 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.673265 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.686090 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.702655 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.709845 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.709879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.709891 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.709911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.709922 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.715269 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.715730 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7011cf3-078f-4c08-bef7-f89fe27e51f5" containerID="967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc" exitCode=0 Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.715761 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerDied","Data":"967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.733346 4744 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.743741 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.756301 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.768348 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.784187 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.797151 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.813628 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.814455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.814485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.814496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.814513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.814526 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.829685 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.843858 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.861902 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.875074 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.897625 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.910509 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.918075 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.918106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.918117 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.918133 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.918141 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:03Z","lastTransitionTime":"2025-09-30T02:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.922268 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.931525 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.945035 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.956057 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:03 crc kubenswrapper[4744]: I0930 02:55:03.966531 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.020839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.020881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.020891 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.020906 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.020919 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.123818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.124215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.124234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.124256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.124271 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.227550 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.227584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.227593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.227606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.227615 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.330328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.330390 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.330404 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.330423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.330434 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.435026 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.435078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.435095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.435119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.435137 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.502878 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.502942 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.502968 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:04 crc kubenswrapper[4744]: E0930 02:55:04.503009 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:04 crc kubenswrapper[4744]: E0930 02:55:04.503192 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:04 crc kubenswrapper[4744]: E0930 02:55:04.503326 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.538778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.538948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.539010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.539072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.539131 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.641233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.641269 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.641279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.641294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.641303 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.722575 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.723147 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.723183 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.728352 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7011cf3-078f-4c08-bef7-f89fe27e51f5" containerID="a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0" exitCode=0 Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.728418 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerDied","Data":"a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.738244 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.751867 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.752637 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.752648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.752666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.752687 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.757185 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.758452 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.760698 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.774603 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.792321 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.811152 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.829664 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.841492 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.854720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.854746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.854755 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.854769 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.854781 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.862308 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.873503 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.885759 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.897740 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.909397 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.920837 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.931855 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d
5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.942040 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.957238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.957274 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.957288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.957309 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.957322 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:04Z","lastTransitionTime":"2025-09-30T02:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.964944 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.977855 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:04 crc kubenswrapper[4744]: I0930 02:55:04.989811 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.008618 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.022042 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81af
e6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.040071 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90
010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.055772 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.060263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.060291 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.060300 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.060314 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.060325 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.071975 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.084696 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.095444 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.106948 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.125642 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.138699 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.156419 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.162481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.162619 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.162678 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.162735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.162792 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.171950 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7
099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-0
9-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.265568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.265864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.265933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.265993 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.266062 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.368923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.368977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.368992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.369009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.369023 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.472502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.473149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.473198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.473230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.473248 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.576043 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.576081 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.576091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.576106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.576115 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.678281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.678324 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.678335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.678349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.678360 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.735880 4744 generic.go:334] "Generic (PLEG): container finished" podID="f7011cf3-078f-4c08-bef7-f89fe27e51f5" containerID="485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499" exitCode=0 Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.735931 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerDied","Data":"485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.736018 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.762488 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e
01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.780669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.780719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.780731 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.780753 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.780766 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.781872 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.800844 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.816493 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81af
e6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.838643 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri
-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.854153 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.864254 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.882556 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.883313 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.883347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.883355 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.883373 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.883384 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.898620 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.912227 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.922221 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.940216 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.955587 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.972304 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.987007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.987061 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.987073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.987093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.987105 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:05Z","lastTransitionTime":"2025-09-30T02:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:05 crc kubenswrapper[4744]: I0930 02:55:05.990648 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.089707 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.089762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.089770 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.089785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.089795 4744 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.191908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.191943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.191951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.191967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.191976 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.294392 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.294435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.294444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.294461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.294470 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.396578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.396635 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.396647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.396664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.396675 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.498723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.498768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.498778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.498794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.498803 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.503522 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.503529 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.503588 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:06 crc kubenswrapper[4744]: E0930 02:55:06.503628 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:06 crc kubenswrapper[4744]: E0930 02:55:06.503725 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:06 crc kubenswrapper[4744]: E0930 02:55:06.503825 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.600932 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.600978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.600987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.601002 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.601011 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.703917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.703949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.703957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.703970 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.703979 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.743207 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" event={"ID":"f7011cf3-078f-4c08-bef7-f89fe27e51f5","Type":"ContainerStarted","Data":"c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.743275 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.767253 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.782103 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.802718 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.806853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.806900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.806913 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.806933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.806944 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.820088 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9
f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.849896 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.867660 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.880449 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.898980 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.909256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.909293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.909304 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.909320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.909338 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:06Z","lastTransitionTime":"2025-09-30T02:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.912915 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z 
is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.925431 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.935905 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 
02:55:06.945705 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.954904 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.965691 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:06 crc kubenswrapper[4744]: I0930 02:55:06.975689 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.011144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.011174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.011183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.011195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.011209 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.113871 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.113902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.113912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.113926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.113935 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.232154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.232212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.232228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.232242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.232251 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.334206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.334250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.334261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.334277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.334287 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.437206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.437261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.437271 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.437284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.437292 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.539510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.539542 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.539552 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.539566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.539576 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.642177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.642209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.642235 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.642248 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.642256 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.744644 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.744689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.744700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.744715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.744733 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.746334 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/0.log" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.748634 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609" exitCode=1 Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.748666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.752594 4744 scope.go:117] "RemoveContainer" containerID="70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.765565 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.774945 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.788201 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c
996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785
a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.801108 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.813295 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.826398 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.838030 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.846856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc 
kubenswrapper[4744]: I0930 02:55:07.846890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.846899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.846916 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.846928 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.856980 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.869593 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.884175 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.900740 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:07Z\\\",\\\"message\\\":\\\"70 6020 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 02:55:07.379180 6020 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:07.379187 6020 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 
02:55:07.379189 6020 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379195 6020 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 02:55:07.379612 6020 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379698 6020 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.379848 6020 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.380022 6020 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:07.380182 6020 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40ed
fe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.913503 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.925134 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.935671 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.945379 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:07Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.948646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.948675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.948686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.948702 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:07 crc kubenswrapper[4744]: I0930 02:55:07.948714 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:07Z","lastTransitionTime":"2025-09-30T02:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.051466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.051511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.051523 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.051538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.051550 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.153699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.153746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.153757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.153774 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.153785 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.255722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.255753 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.255764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.255778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.255786 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.358328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.358398 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.358407 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.358429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.358440 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.460773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.460815 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.460826 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.460843 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.460853 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.477286 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.477401 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.477425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.477477 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.477486 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:55:24.477460247 +0000 UTC m=+51.650680221 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.477523 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:24.477508379 +0000 UTC m=+51.650728353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.477530 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.477599 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:24.477581391 +0000 UTC m=+51.650801355 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.502897 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.502979 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.503134 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.503143 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.503205 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.503274 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.562356 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.562405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.562415 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.562429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.562440 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.578092 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.578131 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578230 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578255 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578266 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578302 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:24.578290082 +0000 UTC m=+51.751510056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578597 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578626 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578638 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.578703 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:24.578681824 +0000 UTC m=+51.751901798 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.663881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.663914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.663922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.663936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.663948 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.752821 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/1.log" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.753416 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/0.log" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.755671 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84" exitCode=1 Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.755718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.755775 4744 scope.go:117] "RemoveContainer" containerID="70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.757706 4744 scope.go:117] "RemoveContainer" containerID="28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84" Sep 30 02:55:08 crc kubenswrapper[4744]: E0930 02:55:08.758564 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.766160 4744 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.766203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.766215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.766231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.766242 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.767085 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.778603 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9
cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.790604 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.805407 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.823086 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.840041 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.854447 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.868913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.868958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.868969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.868984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.868995 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.872861 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:07Z\\\",\\\"message\\\":\\\"70 6020 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 02:55:07.379180 6020 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:07.379187 6020 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 02:55:07.379189 6020 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379195 6020 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 02:55:07.379612 6020 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379698 6020 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.379848 6020 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.380022 6020 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:07.380182 6020 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 
02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.885267 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.902826 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64
fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.914522 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.925166 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.934756 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.948347 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685
d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.962020 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:08Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.972006 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.972034 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.972042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.972055 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:08 crc kubenswrapper[4744]: I0930 02:55:08.972065 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:08Z","lastTransitionTime":"2025-09-30T02:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.074612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.074653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.074661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.074674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.074684 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.177182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.177217 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.177224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.177237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.177247 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.279080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.279117 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.279126 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.279140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.279149 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.425664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.425703 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.425715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.425732 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.425745 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.462170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.462204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.462230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.462243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.462252 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: E0930 02:55:09.472551 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:09Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.475253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.475308 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.475321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.475337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.475348 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: E0930 02:55:09.485644 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:09Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.488604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.488637 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.488648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.488665 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.488679 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: E0930 02:55:09.499225 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:09Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.502561 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.502610 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.502620 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.502636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.502646 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: E0930 02:55:09.513913 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:09Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.517056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.517095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.517103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.517119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.517128 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: E0930 02:55:09.527630 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:09Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:09 crc kubenswrapper[4744]: E0930 02:55:09.527745 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.529021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.529080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.529095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.529108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.529117 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.631033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.631069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.631077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.631091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.631101 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.733670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.733973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.733983 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.733997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.734009 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.770001 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/1.log" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.836579 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.836608 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.836617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.836629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.836637 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.938730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.938781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.938794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.938812 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:09 crc kubenswrapper[4744]: I0930 02:55:09.938824 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:09Z","lastTransitionTime":"2025-09-30T02:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.041376 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.041444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.041462 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.041481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.041493 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.143950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.143988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.143996 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.144009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.144018 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.246146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.246178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.246188 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.246200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.246208 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.348784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.348815 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.348822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.348837 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.348845 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.451931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.451989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.452010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.452037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.452059 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.502820 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.502873 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:10 crc kubenswrapper[4744]: E0930 02:55:10.502917 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.502821 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:10 crc kubenswrapper[4744]: E0930 02:55:10.503028 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:10 crc kubenswrapper[4744]: E0930 02:55:10.503116 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.554086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.554123 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.554131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.554144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.554153 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.655986 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.656018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.656027 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.656044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.656063 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.758850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.758889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.758907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.758924 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.758934 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.862053 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.862092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.862106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.862126 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.862141 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.965405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.965449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.965460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.965477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:10 crc kubenswrapper[4744]: I0930 02:55:10.965486 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:10Z","lastTransitionTime":"2025-09-30T02:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.069220 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.069261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.069273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.069291 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.069303 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.172634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.172675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.172683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.172699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.172709 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.276816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.276855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.276866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.276881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.276889 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.279267 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8"] Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.280165 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.282855 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.283039 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.294199 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445
c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.314070 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.327326 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.338485 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.341541 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.341591 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.341636 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx96k\" (UniqueName: \"kubernetes.io/projected/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-kube-api-access-bx96k\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.341673 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.349487 4744 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.360396 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 
02:55:11.369333 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.378827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.378873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.378884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.378903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.378919 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.379605 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.393091 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.405831 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.419457 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.438436 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:07Z\\\",\\\"message\\\":\\\"70 6020 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 02:55:07.379180 6020 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:07.379187 6020 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 02:55:07.379189 6020 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379195 6020 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 02:55:07.379612 6020 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379698 6020 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.379848 6020 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.380022 6020 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:07.380182 6020 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 
6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.442522 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.442577 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx96k\" (UniqueName: \"kubernetes.io/projected/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-kube-api-access-bx96k\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.442614 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.442657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.443476 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.443867 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.452957 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 
30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.453419 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\
\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 
02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.460779 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx96k\" (UniqueName: \"kubernetes.io/projected/23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4-kube-api-access-bx96k\") pod \"ovnkube-control-plane-749d76644c-s92q8\" (UID: \"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.479482 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a457
5e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.482437 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.482471 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.482480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.482499 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.482510 4744 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.493141 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-et
c-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.508189 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.585593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.585661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.585676 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc 
kubenswrapper[4744]: I0930 02:55:11.585703 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.585723 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.602469 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.688834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.688930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.688946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.688967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.688993 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.786100 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" event={"ID":"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4","Type":"ContainerStarted","Data":"a1049cd4bf26449088aad57491aee7dbb301f8096f6b88269e65286c2abed437"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.791165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.791221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.791237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.791261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.791275 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.895671 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.895756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.895780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.895811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.895832 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.998988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.999022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.999030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.999047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:11 crc kubenswrapper[4744]: I0930 02:55:11.999055 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:11Z","lastTransitionTime":"2025-09-30T02:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.101704 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.101762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.101777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.101794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.101805 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.204087 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.204130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.204140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.204160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.204172 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.308079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.308196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.308208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.308229 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.308240 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.411925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.411995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.412014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.412042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.412065 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.502858 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:12 crc kubenswrapper[4744]: E0930 02:55:12.503053 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.502892 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:12 crc kubenswrapper[4744]: E0930 02:55:12.503142 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.502858 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:12 crc kubenswrapper[4744]: E0930 02:55:12.503205 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.514679 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.514726 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.514736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.514758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.514770 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.617701 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.617774 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.617798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.617834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.617866 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.720229 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.720277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.720289 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.720306 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.720316 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.739613 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zd85c"] Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.740089 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:12 crc kubenswrapper[4744]: E0930 02:55:12.740164 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.771731 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.793217 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" event={"ID":"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4","Type":"ContainerStarted","Data":"21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.793289 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" event={"ID":"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4","Type":"ContainerStarted","Data":"817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.798212 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.819365 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.823046 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.823119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.823132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.823152 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.823163 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.852082 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:07Z\\\",\\\"message\\\":\\\"70 6020 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 02:55:07.379180 6020 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:07.379187 6020 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 02:55:07.379189 6020 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379195 6020 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 02:55:07.379612 6020 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379698 6020 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.379848 6020 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.380022 6020 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:07.380182 6020 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 
02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.856560 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.856640 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtqk\" (UniqueName: \"kubernetes.io/projected/d91f1289-b199-4e91-9bbd-78ec9a433706-kube-api-access-djtqk\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.871640 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.889446 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.907817 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.925702 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.927875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.927954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.927970 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.927989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.928020 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:12Z","lastTransitionTime":"2025-09-30T02:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.940468 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.956237 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.957867 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:12 crc kubenswrapper[4744]: E0930 02:55:12.958182 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:12 crc kubenswrapper[4744]: E0930 02:55:12.958335 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:13.458309714 +0000 UTC m=+40.631529688 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.958551 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djtqk\" (UniqueName: \"kubernetes.io/projected/d91f1289-b199-4e91-9bbd-78ec9a433706-kube-api-access-djtqk\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.973201 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.976146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtqk\" (UniqueName: \"kubernetes.io/projected/d91f1289-b199-4e91-9bbd-78ec9a433706-kube-api-access-djtqk\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:12 crc kubenswrapper[4744]: I0930 02:55:12.988680 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:12Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.008976 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.030485 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.031343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.031409 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.031418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.031435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.031449 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.054237 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.080123 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.096288 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc 
kubenswrapper[4744]: I0930 02:55:13.119229 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.134166 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.135202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.135256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.135270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.135293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.135309 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.149854 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.165412 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0
b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z 
is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.180556 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.198161 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c
187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.216513 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.231330 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.237647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.237875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.238080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.238283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.238463 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.247784 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.270740 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.287550 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc 
kubenswrapper[4744]: I0930 02:55:13.301039 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.316298 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.335773 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:07Z\\\",\\\"message\\\":\\\"70 6020 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 02:55:07.379180 6020 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:07.379187 6020 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 02:55:07.379189 6020 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379195 6020 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 02:55:07.379612 6020 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379698 6020 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.379848 6020 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.380022 6020 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:07.380182 6020 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 
02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.341414 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.341481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.341500 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.341525 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.341543 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.353032 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z 
is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.373249 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09
-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.386704 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.445154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.445209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.445219 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.445236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.445246 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.463413 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:13 crc kubenswrapper[4744]: E0930 02:55:13.463621 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:13 crc kubenswrapper[4744]: E0930 02:55:13.464030 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:14.463993486 +0000 UTC m=+41.637213490 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.539737 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8f
e73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.548897 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.549156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.549285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.549449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.549575 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.558404 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.581117 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.607860 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70f3469591bd779340eed03b3e21b1f6833d02d8d74bcd55ad6d5a2a8cfc9609\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:07Z\\\",\\\"message\\\":\\\"70 6020 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 02:55:07.379180 6020 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:07.379187 6020 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 02:55:07.379189 6020 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379195 6020 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 02:55:07.379612 6020 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 02:55:07.379698 6020 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.379848 6020 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 02:55:07.380022 6020 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:07.380182 6020 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 
02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"hos
t-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.625814 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.639197 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.653687 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.653774 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc 
kubenswrapper[4744]: I0930 02:55:13.653841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.653864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.653890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.653908 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.668793 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.686315 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.700949 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d
5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.712802 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.724103 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcb
ec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.736960 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc 
kubenswrapper[4744]: I0930 02:55:13.751255 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.756501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.756545 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.756558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.756622 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.756637 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.764548 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.777923 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.799150 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.859276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.859343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.859352 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.859367 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.859392 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.963026 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.963073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.963081 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.963098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:13 crc kubenswrapper[4744]: I0930 02:55:13.963106 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:13Z","lastTransitionTime":"2025-09-30T02:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.065630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.065679 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.065691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.065709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.065721 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.168911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.168977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.168998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.169025 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.169044 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.272325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.272713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.272726 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.272746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.272759 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.375911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.375963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.375975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.375997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.376009 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.475161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:14 crc kubenswrapper[4744]: E0930 02:55:14.475315 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:14 crc kubenswrapper[4744]: E0930 02:55:14.475428 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:16.47540032 +0000 UTC m=+43.648620294 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.479355 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.479416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.479425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.479440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.479449 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.502660 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.502749 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.502768 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.502683 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:14 crc kubenswrapper[4744]: E0930 02:55:14.502901 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:14 crc kubenswrapper[4744]: E0930 02:55:14.503010 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:14 crc kubenswrapper[4744]: E0930 02:55:14.503116 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:14 crc kubenswrapper[4744]: E0930 02:55:14.503253 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.582270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.582318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.582329 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.582349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.582362 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.684767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.685103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.685241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.685371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.685512 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.791105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.791371 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.791416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.791440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.791461 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.893908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.893956 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.893968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.893987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.894002 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.996918 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.996964 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.996973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.996990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:14 crc kubenswrapper[4744]: I0930 02:55:14.997002 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:14Z","lastTransitionTime":"2025-09-30T02:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.099362 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.099462 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.099486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.099519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.099547 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.201950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.201992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.202000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.202014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.202023 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.304195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.304263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.304284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.304311 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.304329 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.407001 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.407039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.407047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.407062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.407072 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.509685 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.509743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.509759 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.509784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.509800 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.612631 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.612684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.612730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.612751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.612764 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.715312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.715389 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.715406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.715427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.715442 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.817421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.817462 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.817470 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.817485 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.817498 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.919555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.919628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.919656 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.919688 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:15 crc kubenswrapper[4744]: I0930 02:55:15.919711 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:15Z","lastTransitionTime":"2025-09-30T02:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.022863 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.022907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.022918 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.022936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.022948 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.125111 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.125155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.125164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.125179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.125192 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.228313 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.228360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.228402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.228422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.228437 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.331349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.331469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.331488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.331513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.331532 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.434230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.434303 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.434327 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.434357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.434437 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.503586 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.503666 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.503685 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.503632 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:16 crc kubenswrapper[4744]: E0930 02:55:16.503817 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:16 crc kubenswrapper[4744]: E0930 02:55:16.503893 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:16 crc kubenswrapper[4744]: E0930 02:55:16.503987 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:16 crc kubenswrapper[4744]: E0930 02:55:16.504098 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.507982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:16 crc kubenswrapper[4744]: E0930 02:55:16.508155 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:16 crc kubenswrapper[4744]: E0930 02:55:16.508217 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:20.508198198 +0000 UTC m=+47.681418182 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.537266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.537312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.537324 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.537343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.537356 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.639813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.639848 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.639858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.639873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.639884 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.742454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.742490 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.742500 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.742513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.742523 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.844980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.845035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.845046 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.845069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.845080 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.948344 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.948427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.948439 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.948458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:16 crc kubenswrapper[4744]: I0930 02:55:16.948472 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:16Z","lastTransitionTime":"2025-09-30T02:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.050989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.051082 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.051116 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.051155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.051178 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.154510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.154585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.154609 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.154636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.154657 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.257870 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.258326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.258406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.258436 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.258457 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.362100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.362184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.362204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.362236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.362255 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.465683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.465738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.465755 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.465779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.465795 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.486732 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.488699 4744 scope.go:117] "RemoveContainer" containerID="28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84" Sep 30 02:55:17 crc kubenswrapper[4744]: E0930 02:55:17.488987 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.512440 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.532862 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.563207 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.568886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.568925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.568935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.568950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.568961 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.586961 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:
54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.623269 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.640894 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.653428 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.665505 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.671693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.671746 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.671763 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.671785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.671802 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.679537 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.695912 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.710541 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.722869 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcb
ec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.741348 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.754579 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.774977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.775667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.776058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 
02:55:17.776828 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.777205 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.794553 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.817053 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.835121 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:17Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:17 crc 
kubenswrapper[4744]: I0930 02:55:17.879797 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.879858 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.879875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.879902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.879918 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.982500 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.982540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.982549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.982566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:17 crc kubenswrapper[4744]: I0930 02:55:17.982576 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:17Z","lastTransitionTime":"2025-09-30T02:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.085818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.085910 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.085929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.085958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.085992 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.188969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.189033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.189049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.189074 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.189091 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.291812 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.292119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.292204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.292295 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.292393 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.395518 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.395773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.395892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.395977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.396060 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.498711 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.498782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.498797 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.498820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.498836 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.503072 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.503117 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:18 crc kubenswrapper[4744]: E0930 02:55:18.503193 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.503304 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:18 crc kubenswrapper[4744]: E0930 02:55:18.503556 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:18 crc kubenswrapper[4744]: E0930 02:55:18.503418 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.503393 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:18 crc kubenswrapper[4744]: E0930 02:55:18.503883 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.601209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.601263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.601276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.601295 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.601307 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.704349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.704418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.704430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.704453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.704467 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.808063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.808130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.808148 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.808175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.808192 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.911633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.911736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.911789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.911819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:18 crc kubenswrapper[4744]: I0930 02:55:18.911870 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:18Z","lastTransitionTime":"2025-09-30T02:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.015018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.015079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.015097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.015122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.015140 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.117967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.118039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.118059 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.118086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.118103 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.220842 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.220898 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.220917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.220944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.220965 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.323561 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.323606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.323618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.323634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.323647 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.427748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.427809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.427827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.427856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.427874 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.530705 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.530821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.530844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.530872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.530891 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.634049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.634114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.634133 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.634181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.634203 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.643266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.643294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.643302 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.643315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.643327 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: E0930 02:55:19.662109 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:19Z is after 2025-08-24T17:21:41Z"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.667174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.667224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.667238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.667260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.667275 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 02:55:19 crc kubenswrapper[4744]: E0930 02:55:19.683193 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous \"failed to patch status\" entry above (conditions, allocatable/capacity, images list, nodeInfo) elided ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:19Z is after 2025-08-24T17:21:41Z"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.690808 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.690871 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.690885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.690902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.690917 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 02:55:19 crc kubenswrapper[4744]: E0930 02:55:19.705979 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous \"failed to patch status\" entry above (conditions, allocatable/capacity, images list, nodeInfo) elided ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:19Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.711051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.711117 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.711137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.711168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.711193 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: E0930 02:55:19.727164 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:19Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.731265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.731359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.731427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.731456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.731509 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: E0930 02:55:19.746229 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:19Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:19 crc kubenswrapper[4744]: E0930 02:55:19.746555 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.748697 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.748744 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.748759 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.748780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.748793 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.851167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.851206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.851214 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.851228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.851237 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.953190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.953246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.953257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.953275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:19 crc kubenswrapper[4744]: I0930 02:55:19.953287 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:19Z","lastTransitionTime":"2025-09-30T02:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.055834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.055878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.055890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.055908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.055921 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.158488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.158538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.158550 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.158567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.158579 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.260940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.260988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.261000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.261017 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.261031 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.363058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.363108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.363120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.363137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.363150 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.464953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.464988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.464997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.465011 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.465021 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.502625 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.502663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.502663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.502732 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:20 crc kubenswrapper[4744]: E0930 02:55:20.502893 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:20 crc kubenswrapper[4744]: E0930 02:55:20.503074 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:20 crc kubenswrapper[4744]: E0930 02:55:20.503204 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:20 crc kubenswrapper[4744]: E0930 02:55:20.503301 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.587162 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:20 crc kubenswrapper[4744]: E0930 02:55:20.587357 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:20 crc kubenswrapper[4744]: E0930 02:55:20.587442 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:28.587420008 +0000 UTC m=+55.760640072 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.587935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.587965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.587973 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.587986 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.587998 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.690132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.690184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.690195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.690214 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.690223 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.792928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.792997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.793020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.793050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.793073 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.895949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.896017 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.896042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.896071 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.896093 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.998675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.998710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.998719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.998733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:20 crc kubenswrapper[4744]: I0930 02:55:20.998741 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:20Z","lastTransitionTime":"2025-09-30T02:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.101053 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.101132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.101147 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.101163 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.101172 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.203314 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.203419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.203445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.203475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.203498 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.306086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.306157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.306181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.306208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.306226 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.408275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.408309 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.408316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.408331 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.408339 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.510278 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.510325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.510336 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.510353 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.510365 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.612912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.612985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.613009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.613038 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.613059 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.715268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.715320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.715333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.715353 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.715365 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.817543 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.817576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.817584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.817598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.817607 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.921386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.921444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.921465 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.921486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:21 crc kubenswrapper[4744]: I0930 02:55:21.921499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:21Z","lastTransitionTime":"2025-09-30T02:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.024243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.024289 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.024298 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.024315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.024326 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.126194 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.126246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.126258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.126275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.126288 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.228501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.228541 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.228549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.228566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.228576 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.330985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.331033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.331049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.331066 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.331079 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.433575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.433617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.433626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.433642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.433652 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.502442 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.502484 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.502482 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:22 crc kubenswrapper[4744]: E0930 02:55:22.502558 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.502641 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:22 crc kubenswrapper[4744]: E0930 02:55:22.502761 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:22 crc kubenswrapper[4744]: E0930 02:55:22.502843 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:22 crc kubenswrapper[4744]: E0930 02:55:22.502891 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.536281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.536315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.536323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.536340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.536350 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.639124 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.639173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.639186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.639203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.639215 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.743151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.743203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.743276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.743301 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.743325 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.845683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.845738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.845748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.845778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.845787 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.947612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.947646 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.947655 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.947669 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:22 crc kubenswrapper[4744]: I0930 02:55:22.947678 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:22Z","lastTransitionTime":"2025-09-30T02:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.049755 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.049793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.049804 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.049820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.049829 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.151674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.151709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.151719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.151734 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.151745 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.254339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.254400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.254413 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.254430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.254441 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.357045 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.357094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.357102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.357118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.357129 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.459484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.459535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.459545 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.459561 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.459570 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.515695 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de
5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.528128 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.542497 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.556987 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.561205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.561233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.561242 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.561256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.561265 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.570531 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc 
kubenswrapper[4744]: I0930 02:55:23.585396 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.597384 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.614456 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.626931 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.644285 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.655222 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.664481 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.664517 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.664527 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.664544 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.664558 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.665609 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.674712 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.685456 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.695227 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.703904 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.713453 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcb
ec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:23Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.766734 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.766772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.766783 4744 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.766800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.766811 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.868509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.868548 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.868557 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.868572 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.868580 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.970829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.970861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.970869 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.970886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:23 crc kubenswrapper[4744]: I0930 02:55:23.970894 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:23Z","lastTransitionTime":"2025-09-30T02:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.072733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.072779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.072790 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.072806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.072814 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.176171 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.176236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.176289 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.176322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.176340 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.246725 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.259584 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-control
ler-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.263872 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.277922 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.279233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.279276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.279292 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.279316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.279332 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.294045 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.309725 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.323721 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc 
kubenswrapper[4744]: I0930 02:55:24.342510 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.359324 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.369709 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.381870 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.381901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.381912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.381931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.381941 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.395191 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector 
*v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.408870 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.420852 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.435313 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.447800 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.458511 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.471367 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d
5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.482058 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.484824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.484856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.484899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.484919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.484931 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.495483 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:24Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.502804 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.502941 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.503084 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.503105 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.503133 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.503353 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.503460 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.503523 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.523406 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.523492 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.523588 4744 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.523602 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:55:56.523580645 +0000 UTC m=+83.696800629 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.523707 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:56.523696489 +0000 UTC m=+83.696916483 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.523829 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.523883 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:56.523873634 +0000 UTC m=+83.697093618 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.523729 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.587211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 
02:55:24.587239 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.587246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.587259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.587268 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.624831 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.624925 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625002 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625023 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625037 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625063 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625085 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625101 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625087 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:56.62507246 +0000 UTC m=+83.798292434 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:24 crc kubenswrapper[4744]: E0930 02:55:24.625160 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:56.625143103 +0000 UTC m=+83.798363097 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.689587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.689672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.689688 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.689717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.689750 4744 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.792800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.792841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.792854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.792875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.792891 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.895946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.895997 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.896009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.896028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.896040 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.998986 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.999040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.999051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.999070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:24 crc kubenswrapper[4744]: I0930 02:55:24.999083 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:24Z","lastTransitionTime":"2025-09-30T02:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.101885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.101954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.101975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.102003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.102025 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.204035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.204099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.204116 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.204143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.204160 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.306715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.306781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.306792 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.306808 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.306821 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.409636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.409710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.409733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.409766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.409793 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.512517 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.512569 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.512581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.512596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.512609 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.614682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.614721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.614730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.614743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.614752 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.717684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.717741 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.717761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.717781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.717791 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.820405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.820448 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.820464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.820479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.820490 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.922602 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.922656 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.922667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.922689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:25 crc kubenswrapper[4744]: I0930 02:55:25.922701 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:25Z","lastTransitionTime":"2025-09-30T02:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.025016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.025089 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.025113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.025142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.025166 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.127625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.127664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.127674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.127689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.127698 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.229884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.229929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.229941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.229958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.229972 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.332994 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.333024 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.333032 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.333045 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.333054 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.436013 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.436085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.436105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.436134 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.436161 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.503126 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.503213 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:26 crc kubenswrapper[4744]: E0930 02:55:26.503276 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.503323 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:26 crc kubenswrapper[4744]: E0930 02:55:26.503382 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.503584 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:26 crc kubenswrapper[4744]: E0930 02:55:26.503656 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:26 crc kubenswrapper[4744]: E0930 02:55:26.503908 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.539228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.539282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.539296 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.539319 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.539334 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.641958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.642048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.642066 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.642117 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.642133 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.744322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.744387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.744397 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.744414 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.744423 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.846805 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.846846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.846856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.846872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.846883 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.949839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.949899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.949917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.949947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:26 crc kubenswrapper[4744]: I0930 02:55:26.949970 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:26Z","lastTransitionTime":"2025-09-30T02:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.053232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.053274 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.053283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.053298 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.053308 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.155551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.155658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.155677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.155697 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.155715 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.257861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.257909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.257922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.257939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.257950 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.359511 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.359575 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.359595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.359623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.359641 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.462519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.462580 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.462596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.462622 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.462638 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.565014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.565086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.565111 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.565147 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.565169 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.668087 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.668127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.668135 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.668152 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.668162 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.771718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.771773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.771791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.771817 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.771837 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.884344 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.884434 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.884447 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.884460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.884468 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.987497 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.987611 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.987642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.987682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:27 crc kubenswrapper[4744]: I0930 02:55:27.987709 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:27Z","lastTransitionTime":"2025-09-30T02:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.090169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.090234 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.090251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.090284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.090303 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.192642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.192698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.192712 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.192732 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.192743 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.294715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.294752 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.294762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.294777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.294787 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.397266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.397312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.397324 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.397340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.397352 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.500494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.500562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.500585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.500618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.500640 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.502817 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.502904 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:28 crc kubenswrapper[4744]: E0930 02:55:28.503017 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.503036 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.503098 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:28 crc kubenswrapper[4744]: E0930 02:55:28.503300 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:28 crc kubenswrapper[4744]: E0930 02:55:28.503430 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:28 crc kubenswrapper[4744]: E0930 02:55:28.503558 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.591277 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:28 crc kubenswrapper[4744]: E0930 02:55:28.591534 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:28 crc kubenswrapper[4744]: E0930 02:55:28.591656 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:55:44.591625069 +0000 UTC m=+71.764845113 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.603809 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.603866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.603889 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.603919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.603939 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.706992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.707055 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.707087 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.707119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.707143 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.809980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.810048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.810083 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.810115 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.810134 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.912977 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.913017 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.913028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.913043 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:28 crc kubenswrapper[4744]: I0930 02:55:28.913054 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:28Z","lastTransitionTime":"2025-09-30T02:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.016294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.016402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.016429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.016460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.016484 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.118634 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.118677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.118686 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.118701 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.118712 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.221783 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.221847 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.221866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.221892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.221911 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.324830 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.324961 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.324975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.324992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.325003 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.428247 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.428323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.428347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.428410 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.428438 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.530831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.530875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.530886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.530900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.530912 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.633054 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.633351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.633377 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.633426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.633439 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.736067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.736118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.736132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.736151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.736165 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.839426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.839477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.839490 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.839510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.839522 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.942203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.942256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.942271 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.942295 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:29 crc kubenswrapper[4744]: I0930 02:55:29.942309 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:29Z","lastTransitionTime":"2025-09-30T02:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.044683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.044729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.044743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.044765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.044780 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.052972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.053012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.053022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.053038 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.053049 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.072856 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:30Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.077153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.077182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.077191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.077206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.077218 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.090976 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:30Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.095314 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.095359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.095386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.095401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.095411 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.113880 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:30Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.118681 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.118732 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.118752 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.118777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.118793 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.139900 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:30Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.144408 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.144461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.144476 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.144501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.144514 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.157946 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:30Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.158074 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.159618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.159659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.159671 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.159687 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.159700 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.262791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.262852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.262872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.262899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.262920 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.366871 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.366930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.366946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.366971 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.366989 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.470740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.470898 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.470926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.470959 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.470979 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.503183 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.503236 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.503242 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.503206 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.503467 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.503745 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.503786 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:30 crc kubenswrapper[4744]: E0930 02:55:30.503999 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.574828 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.574893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.574912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.574940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.574959 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.679282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.679343 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.679360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.679422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.679445 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.783569 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.783626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.783661 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.783693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.783711 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.886469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.886545 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.886566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.886603 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.886634 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.989326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.989393 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.989406 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.989426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:30 crc kubenswrapper[4744]: I0930 02:55:30.989439 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:30Z","lastTransitionTime":"2025-09-30T02:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.094346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.094427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.094445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.094472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.094492 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.197554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.197633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.197652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.197682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.197702 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.301090 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.301182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.301209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.301243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.301269 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.403976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.404021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.404032 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.404054 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.404066 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.504002 4744 scope.go:117] "RemoveContainer" containerID="28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.507062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.507164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.507181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.507199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.507294 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.611410 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.611474 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.611494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.611520 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.611537 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.717077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.717126 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.717143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.717169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.717185 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.819672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.819721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.819737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.819761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.819778 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.900508 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/1.log" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.904025 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.905161 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.923855 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:31Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.924190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:31 crc 
kubenswrapper[4744]: I0930 02:55:31.924259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.924271 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.924290 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.924301 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:31Z","lastTransitionTime":"2025-09-30T02:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.941411 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:31Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.953497 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:31Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.967307 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:31Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:31 crc kubenswrapper[4744]: I0930 02:55:31.984040 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:5
4:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f92
0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:31Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.011121 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.027659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.027733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.027754 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.027788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.027809 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.035416 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.064418 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.077332 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc 
kubenswrapper[4744]: I0930 02:55:32.108956 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.125276 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.130013 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.130057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.130076 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.130100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.130114 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.143283 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.167487 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.183361 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.206738 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.224117 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.233305 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.233362 4744 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.233396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.233416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.233429 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.238938 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.255318 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.335834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.335913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.335928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.335951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.335964 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.438411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.438490 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.438510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.438538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.438561 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.503595 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.503647 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.503692 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.503613 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:32 crc kubenswrapper[4744]: E0930 02:55:32.503792 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:32 crc kubenswrapper[4744]: E0930 02:55:32.503905 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:32 crc kubenswrapper[4744]: E0930 02:55:32.503976 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:32 crc kubenswrapper[4744]: E0930 02:55:32.504031 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.541322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.541412 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.541428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.541453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.541468 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.644134 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.644189 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.644200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.644224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.644240 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.748942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.749035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.749056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.749084 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.749109 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.852750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.852818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.852834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.852856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.852869 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.911020 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/2.log" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.912013 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/1.log" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.916784 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9" exitCode=1 Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.916838 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.916890 4744 scope.go:117] "RemoveContainer" containerID="28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.918659 4744 scope.go:117] "RemoveContainer" containerID="f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9" Sep 30 02:55:32 crc kubenswrapper[4744]: E0930 02:55:32.918955 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.935404 4744 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.955394 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.955492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.955519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.955564 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.955590 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:32Z","lastTransitionTime":"2025-09-30T02:55:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.961426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.977952 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:32 crc kubenswrapper[4744]: I0930 02:55:32.993588 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.007727 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc 
kubenswrapper[4744]: I0930 02:55:33.022068 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.035609 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.056918 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db
2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.059011 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.059052 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.059067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.059087 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.059100 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.070961 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:
54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.094210 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.106429 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.116727 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.127538 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.139997 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685
d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.149715 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.160285 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.162386 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.162438 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.162448 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.162465 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.162475 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.170100 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.184307 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.265840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.265887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.265896 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.265914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.265923 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.375756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.375927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.376010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.376051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.376083 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.478109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.478700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.478760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.478824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.478879 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.521335 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.543426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.556265 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc 
kubenswrapper[4744]: I0930 02:55:33.571698 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.582565 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.582599 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.582633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.582654 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.582666 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.585275 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.615354 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28368ed23ecd1ba0b27daf68f441c2152343e2582edcede473004b9a29d2eb84\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:08Z\\\",\\\"message\\\":\\\"/factory.go:140\\\\nI0930 02:55:08.526953 6192 reflector.go:311] Stopping reflector 
*v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.526998 6192 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 02:55:08.527106 6192 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 02:55:08.534579 6192 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 02:55:08.534665 6192 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 02:55:08.534697 6192 factory.go:656] Stopping watch factory\\\\nI0930 02:55:08.534705 6192 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 02:55:08.534742 6192 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 02:55:08.538888 6192 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 02:55:08.538950 6192 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 02:55:08.539054 6192 ovnkube.go:599] Stopped ovnkube\\\\nI0930 02:55:08.539110 6192 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 02:55:08.539262 6192 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db
2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.634071 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.656205 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.676760 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.686170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.686232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.686244 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.686287 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.686301 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.691119 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.708734 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.727111 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.742164 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.755497 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.771545 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.785560 4744 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.788942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.789008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.789041 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.789063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.789073 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.802907 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"nam
e\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.820801 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.891839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.891895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.891908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.891929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.891943 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.923904 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/2.log" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.930006 4744 scope.go:117] "RemoveContainer" containerID="f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9" Sep 30 02:55:33 crc kubenswrapper[4744]: E0930 02:55:33.930463 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.954684 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.974205 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.996404 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.996460 4744 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.996474 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.996495 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.996508 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:33Z","lastTransitionTime":"2025-09-30T02:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:33 crc kubenswrapper[4744]: I0930 02:55:33.998595 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:33Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.011481 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.035025 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.047557 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.060764 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.080119 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.099689 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:5
4:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f92
0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.101555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.101659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.101674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: 
I0930 02:55:34.101696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.101735 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.123352 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.145229 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.170334 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.189983 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc 
kubenswrapper[4744]: I0930 02:55:34.204316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.204405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.204422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.204445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.204456 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.226892 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.250542 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.273006 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.305099 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.306925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.306966 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.306975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.306991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.307001 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.324006 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:
54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:34Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.409990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.410038 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.410050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.410065 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.410077 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.503415 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.503511 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.503523 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.503523 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:34 crc kubenswrapper[4744]: E0930 02:55:34.503694 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:34 crc kubenswrapper[4744]: E0930 02:55:34.503891 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:34 crc kubenswrapper[4744]: E0930 02:55:34.503995 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:34 crc kubenswrapper[4744]: E0930 02:55:34.504133 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.512839 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.512884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.512892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.512907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.512916 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.616335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.616460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.616480 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.616513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.616533 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.719207 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.719257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.719266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.719282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.719291 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.823197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.823318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.823336 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.823401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.823422 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.926951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.927010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.927022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.927039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:34 crc kubenswrapper[4744]: I0930 02:55:34.927049 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:34Z","lastTransitionTime":"2025-09-30T02:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.030978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.031051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.031067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.031097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.031115 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.134311 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.134445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.134466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.134496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.134517 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.238717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.238933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.238953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.238985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.239006 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.342315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.342455 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.342492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.342531 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.342558 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.445976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.446044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.446062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.446092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.446118 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.549525 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.549581 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.549590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.549613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.549625 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.652743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.652825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.652850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.652883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.652906 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.756772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.756843 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.756862 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.756894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.756917 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.860808 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.860868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.860879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.860898 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.860914 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.964507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.964593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.964604 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.964625 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:35 crc kubenswrapper[4744]: I0930 02:55:35.964637 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:35Z","lastTransitionTime":"2025-09-30T02:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.067312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.067629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.067699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.067779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.067843 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.171642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.172073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.172216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.172464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.172645 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.274989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.275247 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.275314 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.275413 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.275494 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.378362 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.378649 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.378716 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.379053 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.379122 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.481694 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.482310 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.482433 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.482536 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.482615 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.502745 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:36 crc kubenswrapper[4744]: E0930 02:55:36.503185 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.503528 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.503697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.503697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:36 crc kubenswrapper[4744]: E0930 02:55:36.503868 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:36 crc kubenswrapper[4744]: E0930 02:55:36.503969 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:36 crc kubenswrapper[4744]: E0930 02:55:36.504070 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.584740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.584784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.584795 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.584840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.584855 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.688120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.688197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.688211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.688230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.688241 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.790936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.790999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.791010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.791030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.791040 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.893107 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.893143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.893153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.893168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.893177 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.995821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.995856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.995864 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.995880 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:36 crc kubenswrapper[4744]: I0930 02:55:36.995892 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:36Z","lastTransitionTime":"2025-09-30T02:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.097895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.097928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.097957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.097972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.097983 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.200106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.200153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.200166 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.200185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.200196 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.302444 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.302486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.302496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.302513 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.302524 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.405388 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.405459 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.405471 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.405487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.405499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.507814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.507887 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.508048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.508097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.508119 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.610244 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.610300 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.610310 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.610328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.610340 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.712546 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.712598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.712608 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.712624 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.712634 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.816200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.816251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.816260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.816279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.816290 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.918345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.918403 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.918416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.918432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:37 crc kubenswrapper[4744]: I0930 02:55:37.918441 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:37Z","lastTransitionTime":"2025-09-30T02:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.020998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.021065 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.021084 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.021109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.021141 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.125425 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.125502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.125521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.125551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.125574 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.229056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.229112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.229122 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.229142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.229154 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.335237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.335312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.335349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.335404 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.335427 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.438807 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.438873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.438883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.438903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.438915 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.502739 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.502855 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.502859 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:38 crc kubenswrapper[4744]: E0930 02:55:38.502920 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:38 crc kubenswrapper[4744]: E0930 02:55:38.503074 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.503187 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:38 crc kubenswrapper[4744]: E0930 02:55:38.503242 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:38 crc kubenswrapper[4744]: E0930 02:55:38.503420 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.541668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.541714 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.541728 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.541749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.541769 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.644845 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.644908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.644922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.644944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.644961 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.748873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.748943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.748969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.749000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.749022 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.852508 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.852562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.852573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.852593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.852606 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.959142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.959196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.959210 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.959233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:38 crc kubenswrapper[4744]: I0930 02:55:38.959245 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:38Z","lastTransitionTime":"2025-09-30T02:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.062856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.062907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.062920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.062939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.062951 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.165768 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.165825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.165836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.165854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.165869 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.271156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.271217 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.271228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.271245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.271257 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.375341 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.375618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.375636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.375668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.375688 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.479412 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.479486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.479503 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.479537 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.479555 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.583168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.583225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.583238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.583261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.583274 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.686732 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.686790 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.686800 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.686821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.686834 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.790615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.790663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.790673 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.790692 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.790704 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.893883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.893964 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.893991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.894020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.894040 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.996917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.996987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.997009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.997039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:39 crc kubenswrapper[4744]: I0930 02:55:39.997058 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:39Z","lastTransitionTime":"2025-09-30T02:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.100105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.100157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.100170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.100190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.100205 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.204089 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.204160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.204185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.204212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.204234 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.307560 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.307638 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.307652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.307677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.307693 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.410863 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.410940 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.410958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.410989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.411011 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.415583 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.415630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.415647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.415667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.415683 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.437661 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:40Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.443616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.443666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.443685 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.443712 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.443737 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.483765 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:40Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.489640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.489701 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.489715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.489736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.489747 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.503247 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.503479 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.503532 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.503622 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.504203 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.504209 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.504499 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.504734 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.510164 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:40Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.515428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.515524 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.515554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.515590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.515614 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.531805 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:40Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.536603 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.536648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.536660 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.536681 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.536697 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.552865 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:40Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:40 crc kubenswrapper[4744]: E0930 02:55:40.552991 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.554767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.554806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.554819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.554842 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.554856 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.657719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.657761 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.657771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.657791 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.657806 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.760894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.760941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.760952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.760968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.760977 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.863387 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.863453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.863464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.863483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.863497 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.966639 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.966698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.966712 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.966739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:40 crc kubenswrapper[4744]: I0930 02:55:40.966756 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:40Z","lastTransitionTime":"2025-09-30T02:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.069641 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.069719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.069750 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.069785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.069814 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.174191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.174268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.174299 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.174337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.174403 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.279279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.279359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.279447 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.279483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.279508 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.383693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.383752 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.383764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.383782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.383797 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.486766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.486807 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.486815 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.486829 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.486837 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.589211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.589257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.589266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.589287 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.589297 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.691420 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.691470 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.691482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.691499 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.691509 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.794989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.795042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.795051 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.795068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.795079 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.898197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.898245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.898257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.898281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:41 crc kubenswrapper[4744]: I0930 02:55:41.898294 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:41Z","lastTransitionTime":"2025-09-30T02:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.000715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.000760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.000771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.000786 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.000795 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.103507 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.103544 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.103556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.103571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.103581 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.205479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.205512 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.205521 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.205535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.205543 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.307722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.307779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.307794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.307814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.307830 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.409936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.410015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.410025 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.410040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.410050 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.503184 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.503239 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.503289 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:42 crc kubenswrapper[4744]: E0930 02:55:42.503324 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.503193 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:42 crc kubenswrapper[4744]: E0930 02:55:42.503543 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:42 crc kubenswrapper[4744]: E0930 02:55:42.503569 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:42 crc kubenswrapper[4744]: E0930 02:55:42.503633 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.512469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.512508 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.512518 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.512531 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.512544 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.614778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.614832 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.614840 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.614857 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.614868 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.718214 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.718266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.718276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.718293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.718303 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.820916 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.820985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.821004 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.821032 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.821053 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.924464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.924510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.924526 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.924546 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:42 crc kubenswrapper[4744]: I0930 02:55:42.924556 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:42Z","lastTransitionTime":"2025-09-30T02:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.027570 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.027621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.027632 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.027655 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.027665 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.131396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.131464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.131483 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.131514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.131538 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.233824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.233878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.233888 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.233908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.233920 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.336224 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.336270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.336280 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.336297 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.336307 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.439655 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.439707 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.439715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.439731 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.439742 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.516970 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.531150 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 
02:55:43.542724 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.542764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.542776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.542794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.542806 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.576600 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.613797 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.631339 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.644174 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.644555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.644578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.644587 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc 
kubenswrapper[4744]: I0930 02:55:43.644602 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.644610 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.657433 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.669676 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcb
ec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.684600 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.698044 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.711029 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.726797 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.740216 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc 
kubenswrapper[4744]: I0930 02:55:43.747146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.747187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.747204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.747226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.747242 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.764260 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.781029 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.795951 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.813359 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.828208 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:43Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.855578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc 
kubenswrapper[4744]: I0930 02:55:43.855659 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.855679 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.856098 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.856158 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.959773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.959834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.959844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.959859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:43 crc kubenswrapper[4744]: I0930 02:55:43.959870 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:43Z","lastTransitionTime":"2025-09-30T02:55:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.063946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.064017 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.064037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.064518 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.064560 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.168236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.168304 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.168315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.168333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.168358 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.272244 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.272316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.272335 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.272395 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.272417 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.376127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.376519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.376723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.376868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.377013 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.480861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.480919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.480931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.480949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.480963 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.503618 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.503618 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.503659 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.504079 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:44 crc kubenswrapper[4744]: E0930 02:55:44.504404 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:44 crc kubenswrapper[4744]: E0930 02:55:44.504536 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:44 crc kubenswrapper[4744]: E0930 02:55:44.504627 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:44 crc kubenswrapper[4744]: E0930 02:55:44.504690 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.584605 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.584654 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.584667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.584688 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.584703 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.678246 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:44 crc kubenswrapper[4744]: E0930 02:55:44.678909 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:44 crc kubenswrapper[4744]: E0930 02:55:44.679249 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:56:16.679219386 +0000 UTC m=+103.852439390 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.687737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.687810 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.687822 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.687865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.687882 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.790618 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.790676 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.790693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.790719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.790741 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.894532 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.894591 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.894607 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.894638 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.894657 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.998624 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.998699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.998720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.998749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:44 crc kubenswrapper[4744]: I0930 02:55:44.998766 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:44Z","lastTransitionTime":"2025-09-30T02:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.103175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.103481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.103585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.103666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.103744 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.206784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.207050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.207128 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.207199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.207253 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.309788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.310062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.310141 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.310220 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.310292 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.412152 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.412193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.412205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.412221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.412230 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.515145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.515260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.515278 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.515307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.515329 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.618948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.619264 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.619364 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.619469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.619560 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.723185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.723713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.723936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.724096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.724255 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.828049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.828097 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.828110 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.828131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.828144 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.932756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.933430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.933673 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.933849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:45 crc kubenswrapper[4744]: I0930 02:55:45.933990 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:45Z","lastTransitionTime":"2025-09-30T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.037960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.038020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.038031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.038069 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.038079 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.143084 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.143225 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.143243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.143272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.143289 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.245789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.245844 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.245856 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.245875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.245888 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.349827 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.349936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.349956 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.349984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.350004 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.453231 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.453302 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.453317 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.453337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.453354 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.503314 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.503365 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.503335 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.503314 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:46 crc kubenswrapper[4744]: E0930 02:55:46.503548 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:46 crc kubenswrapper[4744]: E0930 02:55:46.503703 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:46 crc kubenswrapper[4744]: E0930 02:55:46.503898 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:46 crc kubenswrapper[4744]: E0930 02:55:46.504147 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.555322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.555489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.555519 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.555557 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.555583 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.658348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.658617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.658690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.658784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.658866 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.763237 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.763307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.763321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.763340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.763353 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.867332 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.867930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.868000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.868075 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.868162 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.970925 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.970974 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.970988 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.971009 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:46 crc kubenswrapper[4744]: I0930 02:55:46.971023 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:46Z","lastTransitionTime":"2025-09-30T02:55:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.073647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.073738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.073758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.073783 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.073798 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.177540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.177623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.177642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.177671 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.177691 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.280674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.280934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.281006 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.281085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.281149 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.385404 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.385464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.385482 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.385502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.385514 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.495072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.495139 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.495149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.495170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.495181 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.598263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.598325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.598337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.598361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.598397 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.702549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.703019 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.703159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.703317 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.703485 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.807570 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.807650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.807668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.807698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.807721 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.911949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.912048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.912074 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.912109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.912133 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:47Z","lastTransitionTime":"2025-09-30T02:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.982814 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/0.log" Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.982929 4744 generic.go:334] "Generic (PLEG): container finished" podID="6561e3c6-a8d1-4dc8-8bd3-09f042393658" containerID="d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8" exitCode=1 Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.982989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerDied","Data":"d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8"} Sep 30 02:55:47 crc kubenswrapper[4744]: I0930 02:55:47.983615 4744 scope.go:117] "RemoveContainer" containerID="d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.015547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.015929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.015948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.015976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.015998 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.024046 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992
c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.049179 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.075007 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.104210 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.122005 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.122072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.122085 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.122109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.122123 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.129782 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.152721 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.171853 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.192681 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.207582 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.226014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.226062 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.226074 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.226096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.226110 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.227321 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749
c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.243785 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.259838 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.281654 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.300744 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:5
4:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f92
0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.323499 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.329667 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.329719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.329733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.329758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.329770 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.341006 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.359334 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.372593 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:48Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:48 crc 
kubenswrapper[4744]: I0930 02:55:48.432079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.432147 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.432164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.432186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.432199 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.503196 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.503294 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.503201 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.503200 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:48 crc kubenswrapper[4744]: E0930 02:55:48.503405 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:48 crc kubenswrapper[4744]: E0930 02:55:48.503520 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:48 crc kubenswrapper[4744]: E0930 02:55:48.504076 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:48 crc kubenswrapper[4744]: E0930 02:55:48.504188 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.504575 4744 scope.go:117] "RemoveContainer" containerID="f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9" Sep 30 02:55:48 crc kubenswrapper[4744]: E0930 02:55:48.504779 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.534214 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.534260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.534275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.534299 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.534315 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.637778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.637834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.637847 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.637869 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.637883 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.741448 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.741528 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.741542 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.741562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.741575 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.844203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.844259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.844270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.844295 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.844305 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.948012 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.948083 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.948102 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.948130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.948149 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:48Z","lastTransitionTime":"2025-09-30T02:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.988893 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/0.log" Sep 30 02:55:48 crc kubenswrapper[4744]: I0930 02:55:48.988998 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerStarted","Data":"cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.009416 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.028229 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.045923 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.051698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.051767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.051778 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc 
kubenswrapper[4744]: I0930 02:55:49.051799 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.051836 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.062199 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.082934 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.127356 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.146405 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc 
kubenswrapper[4744]: I0930 02:55:49.154551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.154632 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.154658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.154694 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.154721 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.167143 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de
5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.191463 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.225015 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.249256 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.258204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.258259 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.258279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.258307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.258326 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.284043 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.308323 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.325564 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.338964 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.362055 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.362140 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.362157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.362186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.362201 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.362768 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.384251 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 
02:55:49.409592 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:49Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.465909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.465978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.465996 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.466023 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.466044 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.569147 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.569201 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.569214 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.569241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.569262 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.672645 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.672738 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.672758 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.672785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.672804 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.775991 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.776049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.776068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.776099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.776123 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.878623 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.878670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.878684 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.878704 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.878717 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.981547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.981595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.981609 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.981628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:49 crc kubenswrapper[4744]: I0930 02:55:49.981638 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:49Z","lastTransitionTime":"2025-09-30T02:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.084683 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.084733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.084743 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.084765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.084775 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.188494 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.188538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.188549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.188571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.188583 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.292091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.292185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.292212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.292240 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.292258 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.395824 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.395905 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.395924 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.395950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.395971 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.499520 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.499976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.500118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.500260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.500521 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.502982 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.503025 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.503023 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.503127 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.503257 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.503656 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.503816 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.503902 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.603869 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.603913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.603942 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.603963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.603973 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.706419 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.706718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.706787 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.706881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.706943 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.810039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.810185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.810200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.810223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.810243 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.880687 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.880765 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.880781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.880798 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.880808 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.896755 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:50Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.901930 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.901979 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.901989 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.902006 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.902017 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.916549 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:50Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.921421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.921457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.921469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.921489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.921502 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.938517 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:50Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.943196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.943517 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.943742 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.943931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.944133 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.966432 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:50Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.972883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.972932 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.972943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.972967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.972981 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.991992 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:50Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:50 crc kubenswrapper[4744]: E0930 02:55:50.992118 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.994294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.994327 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.994338 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.994357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:50 crc kubenswrapper[4744]: I0930 02:55:50.994382 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:50Z","lastTransitionTime":"2025-09-30T02:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.097505 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.097565 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.097578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.097603 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.097617 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.200180 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.200282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.200306 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.200361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.200468 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.304066 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.304118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.304126 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.304144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.304158 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.407276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.407341 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.407357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.407402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.407418 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.509257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.509326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.509349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.509409 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.509432 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.612965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.613050 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.613070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.613100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.613133 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.715938 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.716015 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.716030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.716073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.716094 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.818178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.818260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.818288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.818323 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.818342 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.922199 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.922285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.922306 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.922339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:51 crc kubenswrapper[4744]: I0930 02:55:51.922362 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:51Z","lastTransitionTime":"2025-09-30T02:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.025566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.025645 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.025668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.025695 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.025719 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.128990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.129076 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.129094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.129120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.129144 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.232414 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.232462 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.232471 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.232487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.232496 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.335103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.335170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.335181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.335201 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.335213 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.438071 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.438169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.438193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.438221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.438239 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.503627 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.503663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.503760 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.503643 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:52 crc kubenswrapper[4744]: E0930 02:55:52.503815 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:52 crc kubenswrapper[4744]: E0930 02:55:52.503945 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:52 crc kubenswrapper[4744]: E0930 02:55:52.504031 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:52 crc kubenswrapper[4744]: E0930 02:55:52.504238 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.541251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.541322 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.541336 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.541358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.541393 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.643632 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.643708 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.643735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.643764 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.643786 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.747422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.747466 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.747475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.747496 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.747507 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.850106 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.850153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.850162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.850179 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.850192 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.952202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.952246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.952255 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.952272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:52 crc kubenswrapper[4744]: I0930 02:55:52.952284 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:52Z","lastTransitionTime":"2025-09-30T02:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.054902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.054957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.054975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.055001 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.055071 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.157802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.157860 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.157874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.157901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.157919 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.260759 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.260907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.260921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.260943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.260957 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.363139 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.363185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.363194 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.363212 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.363221 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.466779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.466842 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.466854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.466879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.466894 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.519337 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.542542 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.557959 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.570194 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.570270 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.570293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.570325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.570352 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.573850 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.608088 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.623174 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.635843 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.647777 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.659245 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.672056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.672093 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.672105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.672142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.672155 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.672347 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749
c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.682815 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.692277 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.703560 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.718856 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8c
cf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host
/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.731241 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.744198 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.756041 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.771951 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:53Z is after 2025-08-24T17:21:41Z" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.775438 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.775629 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.775643 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.775666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.775681 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.878874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.878945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.878960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.878985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.879001 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.981114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.981153 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.981165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.981184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:53 crc kubenswrapper[4744]: I0930 02:55:53.981195 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:53Z","lastTransitionTime":"2025-09-30T02:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.083385 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.083440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.083451 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.083475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.083487 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.186301 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.186358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.186405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.186431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.186447 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.289202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.289251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.289261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.289282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.289294 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.391100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.391135 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.391143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.391159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.391168 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.493436 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.493477 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.493486 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.493501 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.493510 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.502949 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:54 crc kubenswrapper[4744]: E0930 02:55:54.503084 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.502960 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:54 crc kubenswrapper[4744]: E0930 02:55:54.503150 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.502961 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:54 crc kubenswrapper[4744]: E0930 02:55:54.503204 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.502949 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:54 crc kubenswrapper[4744]: E0930 02:55:54.503260 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.596221 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.596266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.596276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.596293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.596302 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.699075 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.699131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.699142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.699164 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.699175 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.801672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.801710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.801719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.801739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.801750 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.904901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.904939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.904948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.904963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:54 crc kubenswrapper[4744]: I0930 02:55:54.904972 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:54Z","lastTransitionTime":"2025-09-30T02:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.007866 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.007913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.007926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.007946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.007961 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.111003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.111073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.111092 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.111113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.111152 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.214931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.214992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.215010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.215037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.215055 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.318208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.318315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.318334 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.318430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.318499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.422735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.422780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.422795 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.422818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.422834 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.526691 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.526740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.526753 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.526776 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.526787 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.629302 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.629338 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.629363 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.629394 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.629404 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.733155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.733226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.733251 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.733281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.733303 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.835980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.836285 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.836434 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.836564 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.836656 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.939578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.939633 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.939650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.939680 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:55 crc kubenswrapper[4744]: I0930 02:55:55.939704 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:55Z","lastTransitionTime":"2025-09-30T02:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.043227 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.043269 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.043278 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.043299 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.043309 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.146156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.146229 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.146306 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.146342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.146395 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.248923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.248958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.248968 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.248985 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.248996 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.351722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.351757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.351767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.351785 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.351795 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.454533 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.454568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.454577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.454597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.454608 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.502654 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.502785 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.502997 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.503312 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.503426 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.503532 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.503744 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.503855 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.528988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.529141 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:00.52912221 +0000 UTC m=+147.702342184 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.529136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.529231 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.529261 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:57:00.529253694 +0000 UTC m=+147.702473668 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.529250 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.529391 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.529440 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:57:00.52942697 +0000 UTC m=+147.702646944 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.557487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.557547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.557563 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.557592 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.557614 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.630046 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.630285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630460 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630484 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630498 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630563 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:57:00.630548914 +0000 UTC m=+147.803768888 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630628 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630685 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630710 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:56 crc kubenswrapper[4744]: E0930 02:55:56.630803 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:57:00.63077468 +0000 UTC m=+147.803994694 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.659825 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.659900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.659922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.659948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.659966 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.763670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.763740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.763872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.763909 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.763926 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.866878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.867001 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.867024 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.867049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.867060 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.969806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.969860 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.969874 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.969894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:56 crc kubenswrapper[4744]: I0930 02:55:56.969907 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:56Z","lastTransitionTime":"2025-09-30T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.073324 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.073402 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.073413 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.073432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.073442 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.175907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.175965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.175974 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.175992 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.176003 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.278908 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.278953 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.278962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.278978 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.278987 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.380987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.381019 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.381030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.381047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.381056 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.484068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.484131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.484146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.484168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.484183 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.586311 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.586361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.586400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.586421 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.586433 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.689936 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.690006 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.690020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.690044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.690057 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.792404 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.792447 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.792456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.792474 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.792483 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.895709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.895773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.895790 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.895815 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.895833 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.998748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.998819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.998841 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.998875 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:57 crc kubenswrapper[4744]: I0930 02:55:57.998895 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:57Z","lastTransitionTime":"2025-09-30T02:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.101348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.101409 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.101422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.101442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.101454 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.204035 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.204078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.204091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.204107 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.204117 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.307394 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.310735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.310784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.310831 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.310857 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.414271 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.414330 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.414339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.414358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.414426 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.503450 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.503512 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:55:58 crc kubenswrapper[4744]: E0930 02:55:58.503594 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.503450 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.503459 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:55:58 crc kubenswrapper[4744]: E0930 02:55:58.503718 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:55:58 crc kubenswrapper[4744]: E0930 02:55:58.503917 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:55:58 crc kubenswrapper[4744]: E0930 02:55:58.504080 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.518088 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.518145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.518161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.518185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.518203 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.621514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.621690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.621707 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.621729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.621741 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.724299 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.724349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.724360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.724401 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.724412 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.826710 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.826767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.826782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.826802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.826815 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.930045 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.930142 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.930160 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.930185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:58 crc kubenswrapper[4744]: I0930 02:55:58.930197 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:58Z","lastTransitionTime":"2025-09-30T02:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.033044 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.033103 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.033119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.033143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.033158 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.135868 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.135914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.135928 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.135948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.135963 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.238621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.239172 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.239195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.239215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.239236 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.342114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.342423 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.342445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.342469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.342480 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.445131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.445184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.445196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.445219 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.445233 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.548169 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.548252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.548276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.548317 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.548338 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.651400 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.651445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.651473 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.651498 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.651515 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.753806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.753859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.753872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.753890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.753902 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.857118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.857173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.857183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.857200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.857210 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.960551 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.960628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.960638 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.960657 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:55:59 crc kubenswrapper[4744]: I0930 02:55:59.960667 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:55:59Z","lastTransitionTime":"2025-09-30T02:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.063427 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.063467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.063475 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.063490 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.063500 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.166489 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.166542 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.166555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.166577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.166589 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.269330 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.269418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.269431 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.269454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.269467 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.372731 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.372781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.372792 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.372814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.372827 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.475307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.475384 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.475395 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.475411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.475420 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.502935 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.502985 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:00 crc kubenswrapper[4744]: E0930 02:56:00.503150 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.503163 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.503280 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:00 crc kubenswrapper[4744]: E0930 02:56:00.503276 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:00 crc kubenswrapper[4744]: E0930 02:56:00.503512 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:00 crc kubenswrapper[4744]: E0930 02:56:00.503951 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.519873 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.578879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.578917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.578927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.578954 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.578967 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.681590 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.681640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.681650 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.681671 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.681684 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.785428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.785488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.785505 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.785530 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.785549 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.888816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.888871 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.888882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.888900 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.888911 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.991777 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.991855 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.991879 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.991912 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:00 crc kubenswrapper[4744]: I0930 02:56:00.991939 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:00Z","lastTransitionTime":"2025-09-30T02:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.012017 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.012058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.012070 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.012091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.012104 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: E0930 02:56:01.027986 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.031405 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.031440 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.031449 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.031469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.031480 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: E0930 02:56:01.044076 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.048240 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.048354 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.048365 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.048393 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.048403 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: E0930 02:56:01.060932 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.064223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.064266 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.064277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.064298 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.064311 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: E0930 02:56:01.077301 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.080674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.080740 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.080760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.080787 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.080806 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: E0930 02:56:01.094980 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:01Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:01 crc kubenswrapper[4744]: E0930 02:56:01.095185 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.096851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.096927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.096949 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.096980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.097003 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.199452 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.199534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.199557 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.199591 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.199613 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.302411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.302506 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.302517 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.302534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.302544 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.405907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.405972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.405995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.406021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.406038 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.508095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.508146 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.508161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.508182 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.508195 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.611796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.611870 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.611883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.611903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.611919 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.714559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.714600 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.714612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.714628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.714637 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.817108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.817172 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.817185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.817220 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.817230 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.919589 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.919638 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.919648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.919666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:01 crc kubenswrapper[4744]: I0930 02:56:01.919676 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:01Z","lastTransitionTime":"2025-09-30T02:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.022647 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.022739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.022762 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.022793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.022809 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.126616 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.127018 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.127031 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.127075 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.127090 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.229836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.229885 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.229894 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.229913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.229923 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.333814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.333846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.333854 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.333871 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.333933 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.436020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.436080 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.436096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.436121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.436139 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.502637 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.502708 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.502718 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.502655 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:02 crc kubenswrapper[4744]: E0930 02:56:02.502910 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:02 crc kubenswrapper[4744]: E0930 02:56:02.503031 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:02 crc kubenswrapper[4744]: E0930 02:56:02.503258 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:02 crc kubenswrapper[4744]: E0930 02:56:02.503448 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.539210 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.539288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.539307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.539337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.539362 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.642118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.642181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.642195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.642217 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.642231 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.745158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.745197 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.745206 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.745222 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.745232 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.847941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.848029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.848040 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.848057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.848069 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.950660 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.950695 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.950703 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.950718 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:02 crc kubenswrapper[4744]: I0930 02:56:02.950728 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:02Z","lastTransitionTime":"2025-09-30T02:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.053348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.053484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.053508 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.053536 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.053553 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.156594 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.156670 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.156690 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.156719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.156737 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.259030 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.259067 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.259077 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.259095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.259107 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.361668 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.361709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.361717 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.361736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.361750 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.463960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.464021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.464039 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.464063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.464079 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.503340 4744 scope.go:117] "RemoveContainer" containerID="f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.528799 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753
fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9
b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.546128 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.559697 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.567883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.567937 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.567948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.567967 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.567978 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.570659 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.582745 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:
54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.593765 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.605299 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.617485 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.626712 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef4cb3-e2de-4fe0-a918-6742fd6beff7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e9fce98fe817bbde50eb98e6f41741d185892ce2526f72a172ff9fd85e7d4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.639001 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.650386 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.661962 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.671104 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.671159 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.671177 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.671204 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.671221 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.678074 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.688779 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.707183 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2
d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.719153 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.729673 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.745988 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.762560 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:03Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.773720 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.773760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.773772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.773820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.773839 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.876272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.876321 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.876399 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.876422 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.876437 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.979183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.979538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.979550 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.979567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:03 crc kubenswrapper[4744]: I0930 02:56:03.979577 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:03Z","lastTransitionTime":"2025-09-30T02:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.054164 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/2.log" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.057106 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.057609 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.082939 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.082979 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.082990 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.083007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.083017 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.083736 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.095714 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.108287 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.128439 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:56:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"m
ountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.142409 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.154499 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o:
//a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.168348 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.182633 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.185960 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.185998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.186010 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.186029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.186040 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.195000 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.208810 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:
54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.220201 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.230440 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.244358 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.273539 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef4cb3-e2de-4fe0-a918-6742fd6beff7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e9fce98fe817bbde50eb98e6f41741d185892ce2526f72a172ff9fd85e7d4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.288464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.288518 4744 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.288528 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.288549 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.288560 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.292261 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.305961 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.319625 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.334181 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.344062 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:04Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:04 crc 
kubenswrapper[4744]: I0930 02:56:04.391243 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.391293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.391305 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.391326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.391339 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.493293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.493352 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.493360 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.493385 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.493395 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.502627 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.502661 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.502695 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.502901 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:04 crc kubenswrapper[4744]: E0930 02:56:04.502991 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:04 crc kubenswrapper[4744]: E0930 02:56:04.502903 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:04 crc kubenswrapper[4744]: E0930 02:56:04.503206 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:04 crc kubenswrapper[4744]: E0930 02:56:04.503241 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.595514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.595554 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.595562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.595578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.595591 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.697626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.697910 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.697987 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.698060 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.698117 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.800606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.800674 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.800688 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.800733 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.800745 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.902847 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.902913 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.902927 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.902946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:04 crc kubenswrapper[4744]: I0930 02:56:04.902960 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:04Z","lastTransitionTime":"2025-09-30T02:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.005628 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.005947 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.006113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.006248 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.006316 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.061573 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/3.log" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.062243 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/2.log" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.065578 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" exitCode=1 Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.065633 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.065725 4744 scope.go:117] "RemoveContainer" containerID="f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.066672 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 02:56:05 crc kubenswrapper[4744]: E0930 02:56:05.066945 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.079656 4744 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.092011 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.103426 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.108712 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.108775 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.108802 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.108821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.108831 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.117072 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749
c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.131167 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.143993 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.158278 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.175936 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.188752 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc 
kubenswrapper[4744]: I0930 02:56:05.203017 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef4cb3-e2de-4fe0-a918-6742fd6beff7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e9fce98fe817bbde50eb98e6f41741d185892ce2526f72a172ff9fd85e7d4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.211835 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.211895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.211904 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 
02:56:05.211922 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.211934 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.218760 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.233450 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.252701 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f52c92fac6df3c5a5794bf77b4c6cc7718f20149526ea3fcd0a29d2f5ca11dd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:32Z\\\",\\\"message\\\":\\\"d to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-30T02:55:32Z is after 2025-08-24T17:21:41Z]\\\\nI0930 02:55:32.493004 6480 services_controller.go:434] Service openshift-machine-config-operator/machine-config-controller retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-controller openshift-machine-config-operator aa30290d-3a39-43ba-a212-6439bd680987 4486 0 2025-02-23 05:12:25 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-controller] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00791534b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:56:04Z\\\",\\\"message\\\":\\\"vice openshift-machine-api/machine-api-operator-machine-webhook template LB for network=default: []services.LB{}\\\\nI0930 02:56:04.438359 6905 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0930 02:56:04.438349 6905 services_controller.go:451] Built service 
openshift-marketplace/redhat-operators cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 02:56:04.438384 6905 services_controller.go:452] Built service openshift-marketplace/redhat-operators per-node LB for network=default: []services.LB{}\\\\nI0930 02:56:04.438385 6905 
ovnkub\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:56:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336
fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.269243 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.292268 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.304566 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.315073 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.315109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.315119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.315137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.315146 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.318971 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.330501 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.345700 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:05Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.417816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.417851 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.417861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.417877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.417887 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.520843 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.520876 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.520884 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.520944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.520955 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.623286 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.623350 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.623359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.623479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.623493 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.725693 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.725737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.725748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.725766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.725779 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.828865 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.828944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.828958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.828980 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.828992 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.932806 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.932859 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.932872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.932890 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:05 crc kubenswrapper[4744]: I0930 02:56:05.932900 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:05Z","lastTransitionTime":"2025-09-30T02:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.036269 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.036361 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.036418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.036454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.036476 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.071523 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/3.log" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.076656 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 02:56:06 crc kubenswrapper[4744]: E0930 02:56:06.076937 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.090320 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.101543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.111487 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.123921 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.140167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.140240 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.140260 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.140287 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.140311 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.144550 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.157318 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc 
kubenswrapper[4744]: I0930 02:56:06.168724 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef4cb3-e2de-4fe0-a918-6742fd6beff7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e9fce98fe817bbde50eb98e6f41741d185892ce2526f72a172ff9fd85e7d4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.181915 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.195543 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.208979 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.223549 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.242342 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.243861 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.243924 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 
02:56:06.243934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.243951 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.243963 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.259214 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.271341 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.290669 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:56:04Z\\\",\\\"message\\\":\\\"vice openshift-machine-api/machine-api-operator-machine-webhook template LB for 
network=default: []services.LB{}\\\\nI0930 02:56:04.438359 6905 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0930 02:56:04.438349 6905 services_controller.go:451] Built service openshift-marketplace/redhat-operators cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 02:56:04.438384 6905 services_controller.go:452] Built service openshift-marketplace/redhat-operators per-node LB for network=default: []services.LB{}\\\\nI0930 02:56:04.438385 6905 ovnkub\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:56:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.304613 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.318167 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.331427 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.343269 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:06Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.345946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.346005 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.346033 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.346057 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.346083 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.448958 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.449072 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.449091 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.449119 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.449138 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.502649 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:06 crc kubenswrapper[4744]: E0930 02:56:06.502844 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.502907 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.502949 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:06 crc kubenswrapper[4744]: E0930 02:56:06.503069 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.503095 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:06 crc kubenswrapper[4744]: E0930 02:56:06.503125 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:06 crc kubenswrapper[4744]: E0930 02:56:06.503307 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.552878 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.552919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.552929 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.552945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.552956 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.655907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.655984 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.656003 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.656028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.656043 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.758250 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.758309 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.758320 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.758339 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.758351 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.860612 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.860689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.860706 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.860734 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.860765 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.963555 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.963596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.963607 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.963626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:06 crc kubenswrapper[4744]: I0930 02:56:06.963635 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:06Z","lastTransitionTime":"2025-09-30T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.066032 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.066116 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.066131 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.066154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.066170 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.169566 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.169721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.169748 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.169773 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.169832 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.272701 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.272970 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.273042 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.273120 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.273189 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.375333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.375393 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.375403 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.375418 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.375427 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.479071 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.479196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.479215 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.479246 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.479263 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.581346 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.581428 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.581461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.581478 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.581489 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.684136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.684181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.684190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.684209 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.684221 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.786957 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.787007 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.787016 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.787036 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.787046 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.890110 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.890205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.890223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.890252 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.890272 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.992737 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.992818 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.992833 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.992849 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:07 crc kubenswrapper[4744]: I0930 02:56:07.992861 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:07Z","lastTransitionTime":"2025-09-30T02:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.094578 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.094613 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.094622 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.094636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.094644 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.198183 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.198238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.198253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.198278 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.198293 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.301168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.301218 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.301236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.301255 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.301267 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.403780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.403835 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.403846 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.403867 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.403879 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.503208 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.503239 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.503234 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.503208 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:08 crc kubenswrapper[4744]: E0930 02:56:08.503396 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:08 crc kubenswrapper[4744]: E0930 02:56:08.503537 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:08 crc kubenswrapper[4744]: E0930 02:56:08.503556 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:08 crc kubenswrapper[4744]: E0930 02:56:08.503710 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.509838 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.509917 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.509933 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.509952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.509970 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.612265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.612303 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.612311 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.612326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.612335 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.714706 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.714760 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.714770 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.714788 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.714799 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.817108 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.817161 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.817173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.817190 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.817199 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.920678 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.920757 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.920772 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.920793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:08 crc kubenswrapper[4744]: I0930 02:56:08.920828 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:08Z","lastTransitionTime":"2025-09-30T02:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.023814 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.023881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.023892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.023911 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.023924 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.126307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.126364 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.126390 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.126411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.126424 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.229672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.229730 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.229742 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.229766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.229778 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.332747 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.332804 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.332813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.332834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.332844 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.436268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.436317 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.436329 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.436347 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.436358 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.538817 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.539454 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.539467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.539487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.539499 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.643047 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.643124 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.643150 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.643184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.643210 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.746099 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.746158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.746171 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.746193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.746204 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.849263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.849325 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.849342 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.849370 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.849424 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.953048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.953113 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.953127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.953149 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:09 crc kubenswrapper[4744]: I0930 02:56:09.953165 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:09Z","lastTransitionTime":"2025-09-30T02:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.056132 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.056213 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.056233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.056258 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.056279 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.158349 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.158416 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.158426 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.158442 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.158452 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.260640 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.260701 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.260715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.260735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.260749 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.363649 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.364198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.364362 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.364576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.364722 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.467178 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.467223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.467232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.467248 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.467258 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.502958 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:10 crc kubenswrapper[4744]: E0930 02:56:10.503201 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.503291 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.503430 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:10 crc kubenswrapper[4744]: E0930 02:56:10.503483 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:10 crc kubenswrapper[4744]: E0930 02:56:10.503571 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.503652 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:10 crc kubenswrapper[4744]: E0930 02:56:10.503913 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.569273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.569307 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.569316 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.569334 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.569344 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.672937 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.673348 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.673464 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.673553 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.673643 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.777105 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.777154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.777168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.777187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.777199 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.880819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.881558 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.881599 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.881626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.881641 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.985223 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.985284 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.985294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.985313 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:10 crc kubenswrapper[4744]: I0930 02:56:10.985325 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:10Z","lastTransitionTime":"2025-09-30T02:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.088487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.088535 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.088547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.088568 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.088580 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.191429 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.191492 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.191502 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.191520 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.191532 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.295943 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.296036 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.296058 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.296090 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.296123 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.399202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.399263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.399279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.399301 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.399315 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.440277 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.440709 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.440852 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.440946 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.441041 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: E0930 02:56:11.454499 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.458950 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.459100 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.459162 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.459232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.459301 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: E0930 02:56:11.473807 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.479729 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.479781 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.479795 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.479816 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.479830 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: E0930 02:56:11.501131 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.506191 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.506236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.506245 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.506265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.506277 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: E0930 02:56:11.524425 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.529914 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.529999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.530019 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.530048 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.530073 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: E0930 02:56:11.547874 4744 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T02:56:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ace33109-5427-4ec8-95ec-e0c80b341759\\\",\\\"systemUUID\\\":\\\"02f345d7-bf31-49d0-b2d4-5371ee59f26c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:11Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:11 crc kubenswrapper[4744]: E0930 02:56:11.548107 4744 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.550609 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.550663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.550687 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.550715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.550739 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.653821 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.653881 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.653901 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.653926 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.653944 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.757608 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.757675 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.757696 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.757723 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.757746 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.860715 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.860780 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.860801 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.860853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.860881 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.963995 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.964049 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.964094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.964118 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:11 crc kubenswrapper[4744]: I0930 02:56:11.964206 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:11Z","lastTransitionTime":"2025-09-30T02:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.067620 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.067672 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.067682 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.067698 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.067709 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.171882 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.172263 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.172353 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.172432 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.172445 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.275226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.275280 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.275293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.275314 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.275331 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.378467 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.378547 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.378567 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.378599 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.378621 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.481121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.481168 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.481181 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.481203 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.481216 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.502730 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.502784 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.502748 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:12 crc kubenswrapper[4744]: E0930 02:56:12.502883 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.502729 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:12 crc kubenswrapper[4744]: E0930 02:56:12.502990 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:12 crc kubenswrapper[4744]: E0930 02:56:12.503051 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:12 crc kubenswrapper[4744]: E0930 02:56:12.503114 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.583411 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.583460 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.583472 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.583493 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.583504 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.686862 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.686907 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.686920 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.686941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.686952 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.790580 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.790627 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.790637 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.790654 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.790667 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.894664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.894767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.894793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.894836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.894869 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.998279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.998328 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.998337 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.998357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:12 crc kubenswrapper[4744]: I0930 02:56:12.998367 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:12Z","lastTransitionTime":"2025-09-30T02:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.143694 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.143948 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.143979 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.144014 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.144041 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.248096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.248184 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.248195 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.248216 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.248229 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.352291 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.352345 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.352357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.352396 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.352408 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.455282 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.455344 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.455357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.455382 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.455420 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.521292 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aef4cb3-e2de-4fe0-a918-6742fd6beff7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9e9fce98fe817bbde50eb98e6f41741d185892ce2526f72a172ff9fd85e7d4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39c54d53b59656dc72787dd6c6fe8be56ba81f31d4bf78afbd365ac1569f8a76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.542035 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7869ba7-0bbe-4f5f-972d-f86b685258b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b091e956f723c5f260a281012fb2875a434b250b5afedd1f5589f6baecb117ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd20c6710de5a9e7589d854928a354cecf1163680857e8fa1ad4ebf8926df243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbce3cc6cdfce6b3a54a373e83a3d4180e11433f2d4fdcccbc1a26c184f99051\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45becc47d1380415d054494ad7c13d5be3099a2e2f0a441cfb943eed0786f920\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.559056 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.559128 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.559155 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.559186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.559217 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.562109 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.584700 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.616972 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7011cf3-078f-4c08-bef7-f89fe27e51f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5cea9372d5e45abb5a9f42d75ad1fd354ff59ff619ae75519ed328f25fd6454\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ccf783077dc4aa5378060647f705324b8ea66066c178eb2057c7b4f8bd4f11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6623c979ce7c996368d7486731ad554173504b3d477110a9de8924fa32a0f16c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5aa08e4d785a4ffd1cfc1a9603c1500872b7099fc802f15c6719125d21de684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://967ac
4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://967ac4f551abe36d112faaf14221a5af287764ca4acf54fbe9d4f1dfc3d766bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a30dda922eee9a98470c1d5da13f3fbc3f34638c89c441a4fd6a8d75198291e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485983c5cb55c59366767c4d5764c023a75aa2d1cae6d3821492d610ffd39499\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:55:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f425n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-v9lx7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.635577 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zd85c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d91f1289-b199-4e91-9bbd-78ec9a433706\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djtqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zd85c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc 
kubenswrapper[4744]: I0930 02:56:13.663845 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.663892 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.663910 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.663934 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.663953 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.670921 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00595208-5d90-4c66-bd0c-2a5b72de2747\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e4cd6a6653e527589c84a7ceb03599ca1dc8551cde49831a466510506b0ac5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fac05d9359a17230d999f9cd7131c2958e105bf30e3355da15b4fe878e836c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe5b48aa95cd9f178f3aa6214d83da0d2600b95a9e9ebba979a258dcab5e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65f8fe73ba1f4d81732b13714d22a8f2d43fcf836ec78d763c75c929cd7f3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9c4ad2716ccafdeb72049c3d5ea48559f2ae4682db2859d599780acfe936da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bdb35c84f357475f90a86fd9a4575e13d61f291c0b64fb482df542b5216ad22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdb4ed7749ea6a1dd3397dbec448197c5694ee4d5c3386d74fa50805a319913\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b262d5364cdd77822fa9b4500f8c06454565e3f11ddb2992c5518a63929fb37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.694672 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0418f19f16edd42078aaa60dcb074e6f967bfbbaffc25a8e4d328f4e0922ef19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.714050 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.746519 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:56:04Z\\\",\\\"message\\\":\\\"vice openshift-machine-api/machine-api-operator-machine-webhook template LB for 
network=default: []services.LB{}\\\\nI0930 02:56:04.438359 6905 services_controller.go:454] Service openshift-machine-api/machine-api-operator-machine-webhook for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0930 02:56:04.438349 6905 services_controller.go:451] Built service openshift-marketplace/redhat-operators cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0930 02:56:04.438384 6905 services_controller.go:452] Built service openshift-marketplace/redhat-operators per-node LB for network=default: []services.LB{}\\\\nI0930 02:56:04.438385 6905 ovnkub\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:56:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27594c0002db2db4d9
657f7208e0d51706b40edfe685644120a3828455336fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmplx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c5kw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.767952 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.768417 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.768741 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.768903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.769069 4744 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.771959 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nxppc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6561e3c6-a8d1-4dc8-8bd3-09f042393658\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4be
2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T02:55:47Z\\\",\\\"message\\\":\\\"2025-09-30T02:55:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083\\\\n2025-09-30T02:55:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_11d017c0-6001-440c-bc3c-5b698154c083 to /host/opt/cni/bin/\\\\n2025-09-30T02:55:02Z [verbose] multus-daemon started\\\\n2025-09-30T02:55:02Z [verbose] Readiness Indicator file check\\\\n2025-09-30T02:55:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T02:55:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6nvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nxppc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.797174 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc24dd03-08bd-4be9-b7ed-350597c29ee6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5337e47d1dba7487ea6376302e441a5ab21e363b06879516097f5413b81d553b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2055f2380b431f757f9729da75c3a953eaa88f1a9f39083a2e61eaf789f25a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b43761a0bc09a945a86e0edfceb0c35e237db804f218d8258f9e6eed8ac0c941\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61743a87f0707f2b0e056e0d197d8dbbb41e4e66acba9370d26330c9ff92198e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:3
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://530de58a8a4166c92f07de64774d5c5c93f0153835bbad135832f6332a2f9a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4c90010fad6bbbaf6a201935eaf2ad0685d9b8350e09a446549de01ae1fc86c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.822613 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://686e14282758a69c8052e08693972cc6a5db11726d55e24497355343e885e9cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85201830cbb292b6d12637eecd2a44f4ca4cdb1f92da0b6d28aacb41402d0d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.841938 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ff70167cc86e66bc5ee28b5bb001097cdf07af232c6ffe5666c542dc3172a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.858475 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c8f22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"739887b5-daec-4bee-944e-d718b3baebc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a7af812761c4aa8fd20564cb602f29272fcff5d5223d4d1ec38ece8f1a5b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-97wsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:58Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c8f22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.873198 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.873265 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.873291 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.873326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.873353 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.878436 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcd18e5e-d9db-4244-983e-f52319c1aa8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d504387f3fb2a1d00d8c94eb56fd0f731b9f6d36ae1957bb32d438af23c543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0446e398b450d1dd6d18ccf3e57749
c3ab165456ee5f8b5dd0a23067c1a29812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c90540f9255c722110f0389493fbf4a253ad560cdfd95404b0157fea05852eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9360d6b9fd0be142fa89e12a898a89ce0224c696c29dc62fb5fb5227fa3e9f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T02:54:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T02:54:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.897643 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0ffb258-115f-4a60-92da-91d4a9036c10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:54:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4664df719d9cd3850dcda46961f0a8d5407886f69a1a13ee9517b57952f6d290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c82
83c11fbdbbdfd4c62ae3c173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:54:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbn2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:54:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kp8zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.916347 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-twwm8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c27f07dd-93c3-4287-9a8c-c6c0e7724776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58e8d7fc2d152bde8751909c171be6cea1543975c15791b6f1b6bc972e0105e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcpnn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-twwm8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.933750 4744 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23ce5b9f-0fdd-493e-8b6a-078a1a7de1a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T02:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817389ca682e94c187e1240b956d3404f21bd53484a3d37964f17acb949926dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c2b1b88c9de6e214b2c3505b2de2bd57fcbec95854297d289ee3bf72a381bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T02:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bx96k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T02:55:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s92q8\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T02:56:13Z is after 2025-08-24T17:21:41Z" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.977129 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.977211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.977226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.977257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:13 crc kubenswrapper[4744]: I0930 02:56:13.977278 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:13Z","lastTransitionTime":"2025-09-30T02:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.081063 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.081121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.081136 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.081158 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.081173 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.184681 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.184749 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.184766 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.184793 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.184811 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.288351 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.288437 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.288458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.288479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.288494 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.391508 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.391600 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.391620 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.391652 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.391673 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.495202 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.495272 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.495299 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.495326 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.495348 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.502795 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.502795 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.502837 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.502970 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:14 crc kubenswrapper[4744]: E0930 02:56:14.503185 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:14 crc kubenswrapper[4744]: E0930 02:56:14.503304 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:14 crc kubenswrapper[4744]: E0930 02:56:14.503422 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:14 crc kubenswrapper[4744]: E0930 02:56:14.503528 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.598359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.598550 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.598571 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.598594 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.598615 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.701534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.701595 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.701619 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.701648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.701675 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.804716 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.804771 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.804789 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.804813 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.804831 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.908281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.908341 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.908353 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.908374 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:14 crc kubenswrapper[4744]: I0930 02:56:14.908404 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:14Z","lastTransitionTime":"2025-09-30T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.012193 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.012269 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.012292 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.012324 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.012352 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.116020 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.116078 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.116096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.116123 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.116142 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.219170 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.219268 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.219288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.219312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.219331 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.323094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.323152 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.323167 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.323187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.323200 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.427288 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.427458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.427487 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.427514 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.427534 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.530735 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.530779 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.530794 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.530808 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.530820 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.633645 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.633727 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.633751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.633782 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.633802 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.738061 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.738138 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.738151 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.738173 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.738188 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.842029 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.842109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.842127 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.842154 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.842173 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.945843 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.945931 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.945976 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.945998 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:15 crc kubenswrapper[4744]: I0930 02:56:15.946011 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:15Z","lastTransitionTime":"2025-09-30T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.050034 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.050094 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.050112 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.050137 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.050155 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.152756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.152811 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.152830 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.152853 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.152867 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.255820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.255872 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.255883 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.255899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.255911 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.359068 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.359111 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.359121 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.359143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.359156 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.462114 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.462174 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.462187 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.462208 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.462221 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.503465 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.503505 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.503583 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:16 crc kubenswrapper[4744]: E0930 02:56:16.503685 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.503838 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:16 crc kubenswrapper[4744]: E0930 02:56:16.504045 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:16 crc kubenswrapper[4744]: E0930 02:56:16.504232 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:16 crc kubenswrapper[4744]: E0930 02:56:16.504304 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.565185 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.565241 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.565254 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.565273 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.565289 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.668465 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.668509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.668522 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.668538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.668552 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.767574 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:16 crc kubenswrapper[4744]: E0930 02:56:16.767781 4744 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:56:16 crc kubenswrapper[4744]: E0930 02:56:16.767875 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs podName:d91f1289-b199-4e91-9bbd-78ec9a433706 nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.7678494 +0000 UTC m=+167.941069384 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs") pod "network-metrics-daemon-zd85c" (UID: "d91f1289-b199-4e91-9bbd-78ec9a433706") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.771491 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.771585 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.771622 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.771653 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.771674 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.875230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.875315 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.875333 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.875359 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.875416 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.980504 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.980584 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.980597 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.980615 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:16 crc kubenswrapper[4744]: I0930 02:56:16.980629 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:16Z","lastTransitionTime":"2025-09-30T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.084130 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.084699 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.084919 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.085144 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.085354 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.188461 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.189095 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.189283 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.189481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.189639 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.293631 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.293796 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.293820 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.293845 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.293864 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.397834 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.397945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.397969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.398000 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.398022 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.501172 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.501226 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.501238 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.501261 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.501276 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.503582 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 02:56:17 crc kubenswrapper[4744]: E0930 02:56:17.503814 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.604658 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.604722 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.604736 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.604756 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.604770 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.708192 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.708257 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.708276 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.708302 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.708321 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.810895 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.810963 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.810982 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.811008 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.811026 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.913969 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.914037 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.914055 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.914079 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:17 crc kubenswrapper[4744]: I0930 02:56:17.914101 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:17Z","lastTransitionTime":"2025-09-30T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.017232 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.017281 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.017294 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.017319 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.017332 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.120458 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.120534 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.120556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.120586 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.120606 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.222910 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.222962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.222975 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.222999 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.223012 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.326368 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.326436 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.326446 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.326469 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.326769 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.429577 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.429617 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.429630 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.429677 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.429691 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.502876 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:18 crc kubenswrapper[4744]: E0930 02:56:18.503070 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.503170 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.503328 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:18 crc kubenswrapper[4744]: E0930 02:56:18.503450 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:18 crc kubenswrapper[4744]: E0930 02:56:18.503571 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.503975 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:18 crc kubenswrapper[4744]: E0930 02:56:18.504134 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.533481 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.533529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.533538 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.533556 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.533567 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.637021 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.637109 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.637129 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.637157 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.637182 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.741430 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.741598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.741663 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.741689 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.741706 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.845641 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.845708 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.845721 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.845751 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.845789 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.949529 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.949598 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.949611 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.949636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:18 crc kubenswrapper[4744]: I0930 02:56:18.949654 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:18Z","lastTransitionTime":"2025-09-30T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.052784 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.052862 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.052877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.052902 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.052926 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.155648 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.156028 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.156096 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.156165 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.156233 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.259145 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.259218 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.259230 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.259249 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.259259 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.362503 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.362562 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.362573 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.362593 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.362603 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.465626 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.465873 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.465965 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.466043 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.466111 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.569944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.570034 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.570061 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.570089 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.570112 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.673357 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.673516 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.673544 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.673576 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.673601 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.777850 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.777916 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.777941 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.777972 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.777996 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.881445 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.881540 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.881569 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.881596 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.881618 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.984559 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.984636 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.984664 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.984700 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:19 crc kubenswrapper[4744]: I0930 02:56:19.984724 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:19Z","lastTransitionTime":"2025-09-30T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.088713 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.088893 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.088982 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.089022 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.089102 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.193139 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.193196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.193211 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.193236 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.193253 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.297503 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.297586 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.297606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.297632 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.297653 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.401133 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.401186 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.401205 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.401228 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.401250 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.502615 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.502732 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.502789 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.503190 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:20 crc kubenswrapper[4744]: E0930 02:56:20.503730 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:20 crc kubenswrapper[4744]: E0930 02:56:20.503891 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:20 crc kubenswrapper[4744]: E0930 02:56:20.504074 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:20 crc kubenswrapper[4744]: E0930 02:56:20.504780 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.505196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.505256 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.505275 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.505302 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.505323 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.609318 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.609436 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.609457 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.609488 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.609511 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.712836 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.712903 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.712921 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.712944 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.712962 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.816621 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.816719 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.816739 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.816767 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.816787 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.919398 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.919456 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.919465 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.919484 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:20 crc kubenswrapper[4744]: I0930 02:56:20.919496 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:20Z","lastTransitionTime":"2025-09-30T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.023086 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.023156 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.023175 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.023196 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.023215 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.125877 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.125935 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.125945 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.125962 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.125974 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.228606 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.228642 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.228651 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.228666 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.228675 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.331819 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.331886 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.331899 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.331923 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.331937 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.435358 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.435435 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.435448 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.435470 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.435484 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.538200 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.538279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.538293 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.538312 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.538330 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.642143 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.642233 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.642253 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.642279 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.642297 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.746340 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.746453 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.746479 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.746510 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.746531 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.790509 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.790620 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.790651 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.790688 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.790709 4744 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T02:56:21Z","lastTransitionTime":"2025-09-30T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.874844 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs"] Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.876119 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.882073 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.882089 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.882423 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.883070 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.931410 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.931481 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.931531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.931677 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.931732 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:21 crc kubenswrapper[4744]: I0930 02:56:21.951299 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c8f22" podStartSLOduration=83.951275851 podStartE2EDuration="1m23.951275851s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:21.951135166 +0000 UTC m=+109.124355190" watchObservedRunningTime="2025-09-30 02:56:21.951275851 +0000 UTC m=+109.124495825" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.025672 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.025644204 podStartE2EDuration="1m31.025644204s" 
podCreationTimestamp="2025-09-30 02:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.002319038 +0000 UTC m=+109.175539042" watchObservedRunningTime="2025-09-30 02:56:22.025644204 +0000 UTC m=+109.198864178" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.026141 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podStartSLOduration=84.026136539 podStartE2EDuration="1m24.026136539s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.025771627 +0000 UTC m=+109.198991601" watchObservedRunningTime="2025-09-30 02:56:22.026136539 +0000 UTC m=+109.199356513" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033010 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033062 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033152 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033184 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033223 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.033228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc 
kubenswrapper[4744]: I0930 02:56:22.034143 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.041275 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-twwm8" podStartSLOduration=84.041251873 podStartE2EDuration="1m24.041251873s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.040981214 +0000 UTC m=+109.214201188" watchObservedRunningTime="2025-09-30 02:56:22.041251873 +0000 UTC m=+109.214471847" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.047250 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.054402 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s92q8" podStartSLOduration=84.054384896 podStartE2EDuration="1m24.054384896s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.053351734 +0000 UTC m=+109.226571708" watchObservedRunningTime="2025-09-30 02:56:22.054384896 +0000 UTC m=+109.227604870" Sep 30 02:56:22 crc 
kubenswrapper[4744]: I0930 02:56:22.056157 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ml2rs\" (UID: \"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.090663 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=58.090643719 podStartE2EDuration="58.090643719s" podCreationTimestamp="2025-09-30 02:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.06853617 +0000 UTC m=+109.241756144" watchObservedRunningTime="2025-09-30 02:56:22.090643719 +0000 UTC m=+109.263863693" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.091032 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.091028821 podStartE2EDuration="1m29.091028821s" podCreationTimestamp="2025-09-30 02:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.089847044 +0000 UTC m=+109.263067018" watchObservedRunningTime="2025-09-30 02:56:22.091028821 +0000 UTC m=+109.264248795" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.150566 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v9lx7" podStartSLOduration=84.150542178 podStartE2EDuration="1m24.150542178s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-09-30 02:56:22.138861909 +0000 UTC m=+109.312081883" watchObservedRunningTime="2025-09-30 02:56:22.150542178 +0000 UTC m=+109.323762152" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.178398 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.178355691 podStartE2EDuration="22.178355691s" podCreationTimestamp="2025-09-30 02:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.163723943 +0000 UTC m=+109.336943927" watchObservedRunningTime="2025-09-30 02:56:22.178355691 +0000 UTC m=+109.351575665" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.207954 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" Sep 30 02:56:22 crc kubenswrapper[4744]: W0930 02:56:22.219903 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf6991a0_3f2a_47f9_b7c5_b3c6d9666a88.slice/crio-a3ceda4373c299aa804584272c90d6494677b93079c12d5752c529403acc4718 WatchSource:0}: Error finding container a3ceda4373c299aa804584272c90d6494677b93079c12d5752c529403acc4718: Status 404 returned error can't find the container with id a3ceda4373c299aa804584272c90d6494677b93079c12d5752c529403acc4718 Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.260706 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nxppc" podStartSLOduration=84.260684329 podStartE2EDuration="1m24.260684329s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.240722416 +0000 UTC m=+109.413942390" 
watchObservedRunningTime="2025-09-30 02:56:22.260684329 +0000 UTC m=+109.433904303" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.292802 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=90.292779454 podStartE2EDuration="1m30.292779454s" podCreationTimestamp="2025-09-30 02:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:22.291284728 +0000 UTC m=+109.464504702" watchObservedRunningTime="2025-09-30 02:56:22.292779454 +0000 UTC m=+109.465999428" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.503153 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.503235 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.503174 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:22 crc kubenswrapper[4744]: I0930 02:56:22.503296 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:22 crc kubenswrapper[4744]: E0930 02:56:22.503327 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:22 crc kubenswrapper[4744]: E0930 02:56:22.503488 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:22 crc kubenswrapper[4744]: E0930 02:56:22.503670 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:22 crc kubenswrapper[4744]: E0930 02:56:22.503924 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:23 crc kubenswrapper[4744]: I0930 02:56:23.174563 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" event={"ID":"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88","Type":"ContainerStarted","Data":"147959b07ededcd1653da94faa65c9b68bf5aa9b91a69b8ab518230a74517321"} Sep 30 02:56:23 crc kubenswrapper[4744]: I0930 02:56:23.174654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" event={"ID":"bf6991a0-3f2a-47f9-b7c5-b3c6d9666a88","Type":"ContainerStarted","Data":"a3ceda4373c299aa804584272c90d6494677b93079c12d5752c529403acc4718"} Sep 30 02:56:24 crc kubenswrapper[4744]: I0930 02:56:24.503208 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:24 crc kubenswrapper[4744]: I0930 02:56:24.503269 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:24 crc kubenswrapper[4744]: I0930 02:56:24.503338 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:24 crc kubenswrapper[4744]: E0930 02:56:24.503944 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:24 crc kubenswrapper[4744]: E0930 02:56:24.503721 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:24 crc kubenswrapper[4744]: I0930 02:56:24.503337 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:24 crc kubenswrapper[4744]: E0930 02:56:24.504129 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:24 crc kubenswrapper[4744]: E0930 02:56:24.504255 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:26 crc kubenswrapper[4744]: I0930 02:56:26.502679 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:26 crc kubenswrapper[4744]: I0930 02:56:26.502709 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:26 crc kubenswrapper[4744]: I0930 02:56:26.502679 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:26 crc kubenswrapper[4744]: E0930 02:56:26.502812 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:26 crc kubenswrapper[4744]: I0930 02:56:26.502848 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:26 crc kubenswrapper[4744]: E0930 02:56:26.502877 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:26 crc kubenswrapper[4744]: E0930 02:56:26.502924 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:26 crc kubenswrapper[4744]: E0930 02:56:26.503047 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:28 crc kubenswrapper[4744]: I0930 02:56:28.502548 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:28 crc kubenswrapper[4744]: I0930 02:56:28.502543 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:28 crc kubenswrapper[4744]: I0930 02:56:28.502749 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:28 crc kubenswrapper[4744]: E0930 02:56:28.502681 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:28 crc kubenswrapper[4744]: E0930 02:56:28.502881 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:28 crc kubenswrapper[4744]: I0930 02:56:28.502563 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:28 crc kubenswrapper[4744]: E0930 02:56:28.502965 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:28 crc kubenswrapper[4744]: E0930 02:56:28.503106 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:29 crc kubenswrapper[4744]: I0930 02:56:29.503809 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 02:56:29 crc kubenswrapper[4744]: E0930 02:56:29.503978 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:56:30 crc kubenswrapper[4744]: I0930 02:56:30.502842 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:30 crc kubenswrapper[4744]: I0930 02:56:30.502856 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:30 crc kubenswrapper[4744]: I0930 02:56:30.502880 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:30 crc kubenswrapper[4744]: E0930 02:56:30.503036 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:30 crc kubenswrapper[4744]: E0930 02:56:30.503172 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:30 crc kubenswrapper[4744]: I0930 02:56:30.503186 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:30 crc kubenswrapper[4744]: E0930 02:56:30.503253 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:30 crc kubenswrapper[4744]: E0930 02:56:30.503609 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:32 crc kubenswrapper[4744]: I0930 02:56:32.503270 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:32 crc kubenswrapper[4744]: I0930 02:56:32.503465 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:32 crc kubenswrapper[4744]: I0930 02:56:32.503469 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:32 crc kubenswrapper[4744]: I0930 02:56:32.503639 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:32 crc kubenswrapper[4744]: E0930 02:56:32.503635 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:32 crc kubenswrapper[4744]: E0930 02:56:32.503702 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:32 crc kubenswrapper[4744]: E0930 02:56:32.503787 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:32 crc kubenswrapper[4744]: E0930 02:56:32.503896 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:33 crc kubenswrapper[4744]: E0930 02:56:33.487235 4744 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 02:56:33 crc kubenswrapper[4744]: E0930 02:56:33.601297 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.216921 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/1.log" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.217755 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/0.log" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.217803 4744 generic.go:334] "Generic (PLEG): container finished" podID="6561e3c6-a8d1-4dc8-8bd3-09f042393658" containerID="cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea" exitCode=1 Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.217838 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerDied","Data":"cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea"} Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.217878 4744 scope.go:117] "RemoveContainer" containerID="d4be2102bc4f447f6090fd81afe6af8960d9d93d6f25eeadee68153f4f3afad8" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.218425 4744 scope.go:117] "RemoveContainer" containerID="cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea" Sep 30 02:56:34 crc kubenswrapper[4744]: E0930 02:56:34.218764 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nxppc_openshift-multus(6561e3c6-a8d1-4dc8-8bd3-09f042393658)\"" pod="openshift-multus/multus-nxppc" podUID="6561e3c6-a8d1-4dc8-8bd3-09f042393658" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.241864 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ml2rs" podStartSLOduration=96.241820382 
podStartE2EDuration="1m36.241820382s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:23.19475815 +0000 UTC m=+110.367978144" watchObservedRunningTime="2025-09-30 02:56:34.241820382 +0000 UTC m=+121.415040376" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.503094 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.503125 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.503114 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:34 crc kubenswrapper[4744]: E0930 02:56:34.503250 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:34 crc kubenswrapper[4744]: E0930 02:56:34.503359 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:34 crc kubenswrapper[4744]: E0930 02:56:34.503437 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:34 crc kubenswrapper[4744]: I0930 02:56:34.503472 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:34 crc kubenswrapper[4744]: E0930 02:56:34.503676 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:35 crc kubenswrapper[4744]: I0930 02:56:35.224327 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/1.log" Sep 30 02:56:36 crc kubenswrapper[4744]: I0930 02:56:36.503456 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:36 crc kubenswrapper[4744]: E0930 02:56:36.503647 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:36 crc kubenswrapper[4744]: I0930 02:56:36.503459 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:36 crc kubenswrapper[4744]: E0930 02:56:36.503724 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:36 crc kubenswrapper[4744]: I0930 02:56:36.503456 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:36 crc kubenswrapper[4744]: E0930 02:56:36.503773 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:36 crc kubenswrapper[4744]: I0930 02:56:36.503800 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:36 crc kubenswrapper[4744]: E0930 02:56:36.503844 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:38 crc kubenswrapper[4744]: I0930 02:56:38.502488 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:38 crc kubenswrapper[4744]: E0930 02:56:38.502647 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:38 crc kubenswrapper[4744]: I0930 02:56:38.502835 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:38 crc kubenswrapper[4744]: I0930 02:56:38.502862 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:38 crc kubenswrapper[4744]: E0930 02:56:38.502911 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:38 crc kubenswrapper[4744]: E0930 02:56:38.503026 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:38 crc kubenswrapper[4744]: I0930 02:56:38.503565 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:38 crc kubenswrapper[4744]: E0930 02:56:38.503768 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:38 crc kubenswrapper[4744]: E0930 02:56:38.603467 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Sep 30 02:56:40 crc kubenswrapper[4744]: I0930 02:56:40.503014 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:40 crc kubenswrapper[4744]: I0930 02:56:40.503082 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:40 crc kubenswrapper[4744]: I0930 02:56:40.503155 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:40 crc kubenswrapper[4744]: I0930 02:56:40.503028 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:40 crc kubenswrapper[4744]: E0930 02:56:40.503334 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:40 crc kubenswrapper[4744]: E0930 02:56:40.503538 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:40 crc kubenswrapper[4744]: E0930 02:56:40.503743 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:40 crc kubenswrapper[4744]: E0930 02:56:40.503875 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:41 crc kubenswrapper[4744]: I0930 02:56:41.504457 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 02:56:41 crc kubenswrapper[4744]: E0930 02:56:41.504664 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c5kw2_openshift-ovn-kubernetes(0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" Sep 30 02:56:42 crc kubenswrapper[4744]: I0930 02:56:42.503396 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:42 crc kubenswrapper[4744]: E0930 02:56:42.504484 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:42 crc kubenswrapper[4744]: I0930 02:56:42.503602 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:42 crc kubenswrapper[4744]: E0930 02:56:42.505294 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:42 crc kubenswrapper[4744]: I0930 02:56:42.503540 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:42 crc kubenswrapper[4744]: E0930 02:56:42.505574 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:42 crc kubenswrapper[4744]: I0930 02:56:42.503620 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:42 crc kubenswrapper[4744]: E0930 02:56:42.505916 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:43 crc kubenswrapper[4744]: E0930 02:56:43.604244 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 02:56:44 crc kubenswrapper[4744]: I0930 02:56:44.503415 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:44 crc kubenswrapper[4744]: I0930 02:56:44.503460 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:44 crc kubenswrapper[4744]: I0930 02:56:44.503465 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:44 crc kubenswrapper[4744]: I0930 02:56:44.503427 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:44 crc kubenswrapper[4744]: E0930 02:56:44.503589 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:44 crc kubenswrapper[4744]: E0930 02:56:44.503667 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:44 crc kubenswrapper[4744]: E0930 02:56:44.503733 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:44 crc kubenswrapper[4744]: E0930 02:56:44.503804 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:46 crc kubenswrapper[4744]: I0930 02:56:46.502824 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:46 crc kubenswrapper[4744]: I0930 02:56:46.502872 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:46 crc kubenswrapper[4744]: I0930 02:56:46.502844 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:46 crc kubenswrapper[4744]: I0930 02:56:46.502837 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:46 crc kubenswrapper[4744]: E0930 02:56:46.503077 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:46 crc kubenswrapper[4744]: E0930 02:56:46.503222 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:46 crc kubenswrapper[4744]: E0930 02:56:46.503338 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:46 crc kubenswrapper[4744]: E0930 02:56:46.503502 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:48 crc kubenswrapper[4744]: I0930 02:56:48.503601 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:48 crc kubenswrapper[4744]: I0930 02:56:48.503612 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:48 crc kubenswrapper[4744]: I0930 02:56:48.503612 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:48 crc kubenswrapper[4744]: E0930 02:56:48.503958 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:48 crc kubenswrapper[4744]: E0930 02:56:48.504136 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:48 crc kubenswrapper[4744]: E0930 02:56:48.504257 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:48 crc kubenswrapper[4744]: I0930 02:56:48.504661 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:48 crc kubenswrapper[4744]: E0930 02:56:48.504823 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:48 crc kubenswrapper[4744]: I0930 02:56:48.504916 4744 scope.go:117] "RemoveContainer" containerID="cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea" Sep 30 02:56:48 crc kubenswrapper[4744]: E0930 02:56:48.606609 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 02:56:49 crc kubenswrapper[4744]: I0930 02:56:49.277109 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/1.log" Sep 30 02:56:49 crc kubenswrapper[4744]: I0930 02:56:49.277175 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerStarted","Data":"87fcda1a58aa577149c9ec3a622519ff80ed9a5e10e797f9632a1ac862b78ced"} Sep 30 02:56:50 crc kubenswrapper[4744]: I0930 02:56:50.503339 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:50 crc kubenswrapper[4744]: I0930 02:56:50.503355 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:50 crc kubenswrapper[4744]: I0930 02:56:50.503503 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:50 crc kubenswrapper[4744]: E0930 02:56:50.503681 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:50 crc kubenswrapper[4744]: I0930 02:56:50.503748 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:50 crc kubenswrapper[4744]: E0930 02:56:50.503825 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:50 crc kubenswrapper[4744]: E0930 02:56:50.503888 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:50 crc kubenswrapper[4744]: E0930 02:56:50.504315 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:52 crc kubenswrapper[4744]: I0930 02:56:52.503564 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:52 crc kubenswrapper[4744]: I0930 02:56:52.503564 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:52 crc kubenswrapper[4744]: I0930 02:56:52.503583 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:52 crc kubenswrapper[4744]: I0930 02:56:52.503676 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:52 crc kubenswrapper[4744]: E0930 02:56:52.504050 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:52 crc kubenswrapper[4744]: E0930 02:56:52.504221 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:52 crc kubenswrapper[4744]: E0930 02:56:52.504478 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:52 crc kubenswrapper[4744]: E0930 02:56:52.504617 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:53 crc kubenswrapper[4744]: E0930 02:56:53.607737 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 02:56:54 crc kubenswrapper[4744]: I0930 02:56:54.503157 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:54 crc kubenswrapper[4744]: I0930 02:56:54.503243 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:54 crc kubenswrapper[4744]: I0930 02:56:54.503283 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:54 crc kubenswrapper[4744]: I0930 02:56:54.503187 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:54 crc kubenswrapper[4744]: E0930 02:56:54.503400 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:54 crc kubenswrapper[4744]: E0930 02:56:54.503509 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:54 crc kubenswrapper[4744]: E0930 02:56:54.503661 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:54 crc kubenswrapper[4744]: E0930 02:56:54.503744 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:56 crc kubenswrapper[4744]: I0930 02:56:56.502680 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:56 crc kubenswrapper[4744]: E0930 02:56:56.502872 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:56 crc kubenswrapper[4744]: I0930 02:56:56.502869 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:56 crc kubenswrapper[4744]: I0930 02:56:56.502921 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:56 crc kubenswrapper[4744]: I0930 02:56:56.503514 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:56 crc kubenswrapper[4744]: E0930 02:56:56.503721 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:56 crc kubenswrapper[4744]: E0930 02:56:56.503796 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:56 crc kubenswrapper[4744]: E0930 02:56:56.503894 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:56 crc kubenswrapper[4744]: I0930 02:56:56.504242 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 02:56:57 crc kubenswrapper[4744]: I0930 02:56:57.307993 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/3.log" Sep 30 02:56:57 crc kubenswrapper[4744]: I0930 02:56:57.311237 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerStarted","Data":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} Sep 30 02:56:57 crc kubenswrapper[4744]: I0930 02:56:57.311932 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:56:57 crc kubenswrapper[4744]: I0930 02:56:57.344956 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podStartSLOduration=119.344930627 podStartE2EDuration="1m59.344930627s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:56:57.343212135 +0000 UTC m=+144.516432119" watchObservedRunningTime="2025-09-30 02:56:57.344930627 +0000 UTC m=+144.518150601" Sep 30 02:56:57 crc kubenswrapper[4744]: I0930 02:56:57.511736 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zd85c"] Sep 30 02:56:57 crc kubenswrapper[4744]: I0930 02:56:57.511878 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:57 crc kubenswrapper[4744]: E0930 02:56:57.512024 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:56:58 crc kubenswrapper[4744]: I0930 02:56:58.503571 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:56:58 crc kubenswrapper[4744]: I0930 02:56:58.503663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:56:58 crc kubenswrapper[4744]: E0930 02:56:58.503752 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:56:58 crc kubenswrapper[4744]: I0930 02:56:58.503672 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:56:58 crc kubenswrapper[4744]: E0930 02:56:58.503886 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:56:58 crc kubenswrapper[4744]: E0930 02:56:58.504003 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:56:58 crc kubenswrapper[4744]: E0930 02:56:58.609121 4744 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 02:56:59 crc kubenswrapper[4744]: I0930 02:56:59.502843 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:56:59 crc kubenswrapper[4744]: E0930 02:56:59.503031 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.502967 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.503032 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.503032 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.503205 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.503606 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.503783 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.585683 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.585899 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:59:02.585861361 +0000 UTC m=+269.759081365 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.585979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.586134 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.586224 4744 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.586307 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:59:02.586291945 +0000 UTC m=+269.759511959 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.586389 4744 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.586508 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 02:59:02.58646958 +0000 UTC m=+269.759689584 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.686725 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:57:00 crc kubenswrapper[4744]: I0930 02:57:00.687316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687057 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687476 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687504 4744 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687601 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687632 4744 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687659 4744 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687633 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 02:59:02.687598574 +0000 UTC m=+269.860818588 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:57:00 crc kubenswrapper[4744]: E0930 02:57:00.687763 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 02:59:02.687725358 +0000 UTC m=+269.860945372 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 02:57:01 crc kubenswrapper[4744]: I0930 02:57:01.503059 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:57:01 crc kubenswrapper[4744]: E0930 02:57:01.503328 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:57:02 crc kubenswrapper[4744]: I0930 02:57:02.503246 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:57:02 crc kubenswrapper[4744]: I0930 02:57:02.503262 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:57:02 crc kubenswrapper[4744]: E0930 02:57:02.503584 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 02:57:02 crc kubenswrapper[4744]: I0930 02:57:02.503637 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:57:02 crc kubenswrapper[4744]: E0930 02:57:02.503892 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 02:57:02 crc kubenswrapper[4744]: E0930 02:57:02.503986 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 02:57:03 crc kubenswrapper[4744]: I0930 02:57:03.503050 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:57:03 crc kubenswrapper[4744]: E0930 02:57:03.505643 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zd85c" podUID="d91f1289-b199-4e91-9bbd-78ec9a433706" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.348273 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.348443 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.503139 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.503246 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.503246 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.507819 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.507820 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.508584 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 02:57:04 crc kubenswrapper[4744]: I0930 02:57:04.511183 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 02:57:05 crc kubenswrapper[4744]: I0930 02:57:05.502897 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:57:05 crc kubenswrapper[4744]: I0930 02:57:05.506942 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 02:57:05 crc kubenswrapper[4744]: I0930 02:57:05.508618 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.165309 4744 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.241597 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tp75b"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.242644 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.243635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jxxmx"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.244845 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.252040 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.252549 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.253051 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.253418 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.253798 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.254526 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.254673 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.254690 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.255080 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.255339 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 
02:57:13.255516 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.257328 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.259496 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.260778 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.261331 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.262687 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.270262 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.270440 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.270542 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.270622 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.270710 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 
02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.271045 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.271226 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wqfz8"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.272005 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.273021 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bpp4m"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.273995 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.274547 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.274671 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.275027 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.275153 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.275915 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.276301 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.276694 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.277164 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.285704 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7d4sv"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.286510 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.286848 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jlll\" (UniqueName: \"kubernetes.io/projected/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-kube-api-access-4jlll\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-config\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287054 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-serving-cert\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287080 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpdt2\" (UniqueName: \"kubernetes.io/projected/138228d2-ee95-4ded-a68c-a7cedecbe375-kube-api-access-dpdt2\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287112 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-config\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287143 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287170 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/138228d2-ee95-4ded-a68c-a7cedecbe375-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287196 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnnwr\" (UniqueName: \"kubernetes.io/projected/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-kube-api-access-qnnwr\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287215 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-images\") pod 
\"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287246 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287268 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-serving-cert\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287289 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-encryption-config\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287317 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-etcd-client\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287337 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138228d2-ee95-4ded-a68c-a7cedecbe375-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287361 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-config\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287415 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfr42\" (UniqueName: \"kubernetes.io/projected/35f384ac-77c6-40c9-a5b6-9aef71930745-kube-api-access-dfr42\") pod \"cluster-samples-operator-665b6dd947-xdncf\" (UID: \"35f384ac-77c6-40c9-a5b6-9aef71930745\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-audit-dir\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-trusted-ca\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " 
pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287477 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gmj\" (UniqueName: \"kubernetes.io/projected/a50ca402-327c-41ea-832c-15ad7932d8f5-kube-api-access-f8gmj\") pod \"downloads-7954f5f757-bpp4m\" (UID: \"a50ca402-327c-41ea-832c-15ad7932d8f5\") " pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287497 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28bgk\" (UniqueName: \"kubernetes.io/projected/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-kube-api-access-28bgk\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287518 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-client-ca\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/35f384ac-77c6-40c9-a5b6-9aef71930745-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdncf\" (UID: \"35f384ac-77c6-40c9-a5b6-9aef71930745\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287564 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138228d2-ee95-4ded-a68c-a7cedecbe375-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287588 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7gd\" (UniqueName: \"kubernetes.io/projected/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-kube-api-access-zb7gd\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287617 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287645 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-audit-policies\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287670 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-serving-cert\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.287720 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.290481 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5rwhq"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.291163 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.291922 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.292628 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.298961 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.299462 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.300235 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xxnzf"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.301107 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.301591 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.301780 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.301899 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.302340 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.302613 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.302833 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303006 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303071 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303148 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303158 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303276 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303006 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 02:57:13 crc 
kubenswrapper[4744]: I0930 02:57:13.303551 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303684 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303812 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.303956 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.304057 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.310096 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7gvsx"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.310605 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.310896 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.311617 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.329141 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.329898 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q47dv"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.330685 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.330829 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.331810 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.339561 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.341234 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.361527 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.361878 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pzp4p"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362055 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362467 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362489 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362517 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362190 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362606 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362594 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 02:57:13 crc 
kubenswrapper[4744]: I0930 02:57:13.362715 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362749 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362854 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362926 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362941 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.362991 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363065 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363122 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363318 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363343 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363461 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 02:57:13 crc 
kubenswrapper[4744]: I0930 02:57:13.363648 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363733 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363898 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.363920 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364064 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364242 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364384 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364434 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364640 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364743 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364868 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364906 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.364988 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.365066 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.365153 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.365239 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.365322 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.365445 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.365622 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.366309 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.366425 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.366940 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.367855 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wbwqr"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.368290 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r5jzt"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.368564 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.368675 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.368561 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.369057 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.369188 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.369333 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.371825 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.372670 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.372855 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.374470 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.375049 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.377620 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.378886 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.381220 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.381898 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.382241 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fpq6t"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.382660 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.390316 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.390803 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391155 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-images\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391201 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnnwr\" (UniqueName: \"kubernetes.io/projected/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-kube-api-access-qnnwr\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391264 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-serving-cert\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391284 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-encryption-config\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391301 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138228d2-ee95-4ded-a68c-a7cedecbe375-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391325 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-etcd-client\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-config\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfr42\" (UniqueName: \"kubernetes.io/projected/35f384ac-77c6-40c9-a5b6-9aef71930745-kube-api-access-dfr42\") pod \"cluster-samples-operator-665b6dd947-xdncf\" (UID: \"35f384ac-77c6-40c9-a5b6-9aef71930745\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-audit-dir\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391424 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gmj\" (UniqueName: \"kubernetes.io/projected/a50ca402-327c-41ea-832c-15ad7932d8f5-kube-api-access-f8gmj\") pod \"downloads-7954f5f757-bpp4m\" (UID: \"a50ca402-327c-41ea-832c-15ad7932d8f5\") " pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391439 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28bgk\" (UniqueName: \"kubernetes.io/projected/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-kube-api-access-28bgk\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391455 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-trusted-ca\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-client-ca\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391492 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138228d2-ee95-4ded-a68c-a7cedecbe375-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391509 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/35f384ac-77c6-40c9-a5b6-9aef71930745-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdncf\" (UID: \"35f384ac-77c6-40c9-a5b6-9aef71930745\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391527 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7gd\" (UniqueName: \"kubernetes.io/projected/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-kube-api-access-zb7gd\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " 
pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391567 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-audit-policies\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-serving-cert\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391618 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391639 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jlll\" (UniqueName: \"kubernetes.io/projected/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-kube-api-access-4jlll\") pod 
\"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-config\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391679 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-serving-cert\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391697 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpdt2\" (UniqueName: \"kubernetes.io/projected/138228d2-ee95-4ded-a68c-a7cedecbe375-kube-api-access-dpdt2\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-config\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391734 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.391754 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/138228d2-ee95-4ded-a68c-a7cedecbe375-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.392138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.392359 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.393012 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-images\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.407301 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/138228d2-ee95-4ded-a68c-a7cedecbe375-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.411799 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-audit-dir\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.415203 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.416048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-encryption-config\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.418632 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-trusted-ca\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.419301 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-client-ca\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: 
I0930 02:57:13.419803 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-serving-cert\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.421621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/138228d2-ee95-4ded-a68c-a7cedecbe375-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.423806 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.424547 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.424810 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.426537 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-config\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.427239 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-config\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.427398 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-config\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.427597 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.429210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-audit-policies\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.444784 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.445227 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.445304 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.445581 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.448126 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.448217 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.448571 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-serving-cert\") pod \"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.448747 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.448940 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.449078 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.449268 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.449389 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-etcd-client\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.434167 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/35f384ac-77c6-40c9-a5b6-9aef71930745-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdncf\" (UID: \"35f384ac-77c6-40c9-a5b6-9aef71930745\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.449753 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-serving-cert\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.450711 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.449458 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.451156 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.452272 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.454428 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.454677 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.456359 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.456703 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.459333 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.459836 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.460692 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.461103 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.461441 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.461742 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5rlq"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.463381 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.463790 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jxxmx"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.463810 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.463849 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.464128 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.465566 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lq7lq"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.465887 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.466170 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.466172 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-txgv6"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.466965 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.468651 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.469172 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.469227 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.470046 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.470972 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.471061 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.471539 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.472042 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.472508 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.473423 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.475496 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tp75b"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.475577 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wqfz8"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.476457 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bpp4m"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.479397 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.479957 4744 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ingress-canary/ingress-canary-gpn94"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.480607 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.480963 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.482825 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.483846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.484798 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.486400 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.486425 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7d4sv"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.488402 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.488497 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pzp4p"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.491874 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q47dv"] 
Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.492810 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wbwqr"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.501047 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.510781 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.511482 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xxnzf"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.513670 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.513694 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5rwhq"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.513707 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.513720 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.513731 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zpfzk"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.514532 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-nhkgz"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.514603 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.515384 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.515624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-txgv6"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.516606 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.517631 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.518675 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fpq6t"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.519744 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lq7lq"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.521170 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.522192 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.523633 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gpn94"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.524318 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl"] Sep 30 
02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.525395 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.526406 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zpfzk"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.527400 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.527804 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.528821 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5rlq"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.529862 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.530910 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7gvsx"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.531984 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.533337 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.534862 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jd84s"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.536127 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-jd84s"] Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.536223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.545421 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.565131 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.585265 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.604406 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.625755 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.645686 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.664715 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.685835 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.705475 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.725327 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.746552 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.765041 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.791827 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.805936 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.825734 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.845823 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.866432 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.885864 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.906250 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.925884 4744 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 02:57:13 crc kubenswrapper[4744]: I0930 02:57:13.945538 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.014702 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnnwr\" (UniqueName: \"kubernetes.io/projected/e771fd9b-4d78-4117-ac7c-40595fa5eb0b-kube-api-access-qnnwr\") pod \"machine-api-operator-5694c8668f-jxxmx\" (UID: \"e771fd9b-4d78-4117-ac7c-40595fa5eb0b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.033697 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfr42\" (UniqueName: \"kubernetes.io/projected/35f384ac-77c6-40c9-a5b6-9aef71930745-kube-api-access-dfr42\") pod \"cluster-samples-operator-665b6dd947-xdncf\" (UID: \"35f384ac-77c6-40c9-a5b6-9aef71930745\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.057959 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gmj\" (UniqueName: \"kubernetes.io/projected/a50ca402-327c-41ea-832c-15ad7932d8f5-kube-api-access-f8gmj\") pod \"downloads-7954f5f757-bpp4m\" (UID: \"a50ca402-327c-41ea-832c-15ad7932d8f5\") " pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.068728 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.080980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28bgk\" (UniqueName: \"kubernetes.io/projected/ed0e7b5c-f54d-4ab7-853d-72d34eed714d-kube-api-access-28bgk\") pod \"apiserver-7bbb656c7d-lx8qb\" (UID: \"ed0e7b5c-f54d-4ab7-853d-72d34eed714d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.106722 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.118641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb7gd\" (UniqueName: \"kubernetes.io/projected/c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7-kube-api-access-zb7gd\") pod \"console-operator-58897d9998-wqfz8\" (UID: \"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7\") " pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.126350 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.146807 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.167428 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.216830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jlll\" (UniqueName: \"kubernetes.io/projected/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-kube-api-access-4jlll\") pod 
\"controller-manager-879f6c89f-tp75b\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.236779 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpdt2\" (UniqueName: \"kubernetes.io/projected/138228d2-ee95-4ded-a68c-a7cedecbe375-kube-api-access-dpdt2\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.236952 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.246129 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.267073 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.269541 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/138228d2-ee95-4ded-a68c-a7cedecbe375-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cb2hl\" (UID: \"138228d2-ee95-4ded-a68c-a7cedecbe375\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.286786 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.290877 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.294252 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.307122 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.325758 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.348177 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.363519 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.367224 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.377750 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.385780 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.414122 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.416890 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bpp4m"] Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.427595 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.446077 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.463063 4744 request.go:700] Waited for 1.001842563s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.463499 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bpp4m" event={"ID":"a50ca402-327c-41ea-832c-15ad7932d8f5","Type":"ContainerStarted","Data":"39d9264c1e81b2dcd92eed75d59b022764eafc74ff1da5bcdc6f52c72b6faf6a"} Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.465416 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 
02:57:14.490549 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.499204 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.503798 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jxxmx"] Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.506291 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.525882 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.545360 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.562726 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf"] Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.574849 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.585322 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.605706 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.626231 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.646176 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.665293 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.686249 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.704558 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.705228 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl"] Sep 30 02:57:14 crc kubenswrapper[4744]: W0930 02:57:14.717513 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod138228d2_ee95_4ded_a68c_a7cedecbe375.slice/crio-31582775cf277ed5fd595abe4837268967c63da7c863f587d05f2f984674e64e WatchSource:0}: Error finding container 31582775cf277ed5fd595abe4837268967c63da7c863f587d05f2f984674e64e: Status 404 returned error can't find the container with id 31582775cf277ed5fd595abe4837268967c63da7c863f587d05f2f984674e64e Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.725043 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.745162 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"signing-cabundle" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.764715 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.774931 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tp75b"] Sep 30 02:57:14 crc kubenswrapper[4744]: W0930 02:57:14.781322 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7426d8_1f6e_4dfc_b3b4_daf337153ce9.slice/crio-87c6a8d567736a15119d5cf41bf7cb70a2c9bb8185834ebc2b3773bf9f4b9339 WatchSource:0}: Error finding container 87c6a8d567736a15119d5cf41bf7cb70a2c9bb8185834ebc2b3773bf9f4b9339: Status 404 returned error can't find the container with id 87c6a8d567736a15119d5cf41bf7cb70a2c9bb8185834ebc2b3773bf9f4b9339 Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.784774 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.805435 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.826123 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.845744 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.866798 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.885506 4744 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.885824 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb"] Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.885869 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wqfz8"] Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.904453 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.927658 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.945012 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.964526 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 02:57:14 crc kubenswrapper[4744]: I0930 02:57:14.986583 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.007001 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.037219 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.047533 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.066122 4744 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.086285 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.106008 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.126101 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.145543 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.167623 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.184676 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.207803 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.224678 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.244250 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.266050 4744 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.287431 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.310415 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.325612 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.351125 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.365706 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.385555 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.405094 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.425288 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.445321 4744 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.463959 4744 request.go:700] Waited for 1.927479896s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.466184 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.469884 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" event={"ID":"35f384ac-77c6-40c9-a5b6-9aef71930745","Type":"ContainerStarted","Data":"56361b10aedb9a17f42b4328237b877f231e5f039194d111cb4d45d41e08d7e9"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.469942 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" event={"ID":"35f384ac-77c6-40c9-a5b6-9aef71930745","Type":"ContainerStarted","Data":"0f3c47ed3e052cce6f9a8032245228fd8b5ce5426c6cb63da6e0954133882b62"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.469960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" event={"ID":"35f384ac-77c6-40c9-a5b6-9aef71930745","Type":"ContainerStarted","Data":"064bd58b9eddeb751d040ea19bb12dcb71592b34671e6ad571c1353c37f9b3db"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.473029 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" event={"ID":"e771fd9b-4d78-4117-ac7c-40595fa5eb0b","Type":"ContainerStarted","Data":"af6947ae55ca9b588087ad314602f8ca611eded4f2cd258ce7832e2ce46c58fb"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.473066 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" 
event={"ID":"e771fd9b-4d78-4117-ac7c-40595fa5eb0b","Type":"ContainerStarted","Data":"431a1ae0305eb7bb31423122bd2ffaa9be74e302a657b3821c74e121e93213af"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.473081 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" event={"ID":"e771fd9b-4d78-4117-ac7c-40595fa5eb0b","Type":"ContainerStarted","Data":"4b06684ea034a16b2b5311f4fc4b5fb962970447e7e405ec1b6db10046006081"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.475129 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" event={"ID":"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9","Type":"ContainerStarted","Data":"5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.475174 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" event={"ID":"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9","Type":"ContainerStarted","Data":"87c6a8d567736a15119d5cf41bf7cb70a2c9bb8185834ebc2b3773bf9f4b9339"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.476313 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.478725 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" event={"ID":"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7","Type":"ContainerStarted","Data":"e46950833e95bdcafd4829c55fbd305dffa41a9f18fbccdcd27130f5c1d34b31"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.478769 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" 
event={"ID":"c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7","Type":"ContainerStarted","Data":"71aa1e218f9e6ae4253fbb54a7dd3aab6ad0915b0fae24187098b5b653fe3da2"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.479216 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.479644 4744 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tp75b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.479716 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" podUID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.482073 4744 patch_prober.go:28] interesting pod/console-operator-58897d9998-wqfz8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.482107 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" podUID="c37bda6b-0d1f-4ae4-86bf-33c01a07c0e7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.483142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-bpp4m" event={"ID":"a50ca402-327c-41ea-832c-15ad7932d8f5","Type":"ContainerStarted","Data":"0f7cc7a3b3a6b4735fcf8875905e2e4aef6ef0e20db7fbd58fd91729a404d013"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.483440 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.484973 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" event={"ID":"138228d2-ee95-4ded-a68c-a7cedecbe375","Type":"ContainerStarted","Data":"554625e7f47672d63677d7ea709e08ce6cc35220a9de0f907aa6832eb30c843d"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.485044 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" event={"ID":"138228d2-ee95-4ded-a68c-a7cedecbe375","Type":"ContainerStarted","Data":"31582775cf277ed5fd595abe4837268967c63da7c863f587d05f2f984674e64e"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.486644 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-bpp4m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.486695 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bpp4m" podUID="a50ca402-327c-41ea-832c-15ad7932d8f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.487453 4744 generic.go:334] "Generic (PLEG): container finished" podID="ed0e7b5c-f54d-4ab7-853d-72d34eed714d" 
containerID="ddeff51036475416b981617ac684b08aaaa915c6b8c8b536a4937b4ef1a1c4c2" exitCode=0 Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.487523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" event={"ID":"ed0e7b5c-f54d-4ab7-853d-72d34eed714d","Type":"ContainerDied","Data":"ddeff51036475416b981617ac684b08aaaa915c6b8c8b536a4937b4ef1a1c4c2"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.487581 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" event={"ID":"ed0e7b5c-f54d-4ab7-853d-72d34eed714d","Type":"ContainerStarted","Data":"a16be865adb9be5bf827866c28bfa3a3be1acfa197518a7436cf930bd0cf9efa"} Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.523976 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrlf\" (UniqueName: \"kubernetes.io/projected/63ea8335-da26-4a4d-b35e-87870d3d61b1-kube-api-access-zqrlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.524032 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a46c375-83ec-4de9-8047-3abde5224588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.524087 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ba3df4a-66a1-47bc-924b-542c7ca89389-metrics-tls\") 
pod \"dns-operator-744455d44c-pzp4p\" (UID: \"1ba3df4a-66a1-47bc-924b-542c7ca89389\") " pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.524242 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0918c73-ef87-42c8-8395-9499c5a91e2b-serving-cert\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.525446 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-bound-sa-token\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.525486 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-encryption-config\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.525557 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ea8335-da26-4a4d-b35e-87870d3d61b1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.525601 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/607782d2-50af-4b1e-a3fe-603ad6267bc9-auth-proxy-config\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.526325 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l46mg\" (UniqueName: \"kubernetes.io/projected/e0918c73-ef87-42c8-8395-9499c5a91e2b-kube-api-access-l46mg\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.526358 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527286 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef71aa6-910d-4a67-bef9-2e37d689408b-serving-cert\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527322 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dk4s\" (UniqueName: 
\"kubernetes.io/projected/eeb7ab47-53ea-434c-8367-ad667abe4168-kube-api-access-7dk4s\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527348 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a46c375-83ec-4de9-8047-3abde5224588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527416 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527473 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527510 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527539 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-image-import-ca\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-audit-policies\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527666 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stpjx\" (UniqueName: \"kubernetes.io/projected/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-kube-api-access-stpjx\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527703 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527728 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-client-ca\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527803 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527831 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527871 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527900 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-oauth-config\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527932 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63ddf643-780d-438a-bf7b-bf73096c9902-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wbwqr\" (UID: \"63ddf643-780d-438a-bf7b-bf73096c9902\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.527959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528137 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnxr6\" (UniqueName: \"kubernetes.io/projected/8ef71aa6-910d-4a67-bef9-2e37d689408b-kube-api-access-dnxr6\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528181 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-client\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528229 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-serving-cert\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528254 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607782d2-50af-4b1e-a3fe-603ad6267bc9-config\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528283 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528310 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-oauth-serving-cert\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528335 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-67pf7\" (UniqueName: \"kubernetes.io/projected/dd70937c-9e84-468b-b81f-b9f400436aec-kube-api-access-67pf7\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528363 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-etcd-client\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528463 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528503 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-service-ca\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528558 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-serving-cert\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528591 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528666 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-trusted-ca\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-audit\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63ea8335-da26-4a4d-b35e-87870d3d61b1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528746 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blmrn\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-kube-api-access-blmrn\") pod 
\"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528767 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-config\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528788 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8z5v\" (UniqueName: \"kubernetes.io/projected/607782d2-50af-4b1e-a3fe-603ad6267bc9-kube-api-access-d8z5v\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528808 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-serving-cert\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528829 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-trusted-ca-bundle\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528852 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb7ab47-53ea-434c-8367-ad667abe4168-serving-cert\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528873 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-config\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528898 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-config\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlg4s\" (UniqueName: \"kubernetes.io/projected/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-kube-api-access-tlg4s\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.528957 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8df09393-7557-4bf8-8cbf-e2aa59df04b6-node-pullsecrets\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 
02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529009 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-service-ca\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529062 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529096 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9xs\" (UniqueName: \"kubernetes.io/projected/1ba3df4a-66a1-47bc-924b-542c7ca89389-kube-api-access-rb9xs\") pod \"dns-operator-744455d44c-pzp4p\" (UID: \"1ba3df4a-66a1-47bc-924b-542c7ca89389\") " pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529121 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-service-ca-bundle\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529150 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529223 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-certificates\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529256 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8df09393-7557-4bf8-8cbf-e2aa59df04b6-audit-dir\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529282 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n26fr\" (UniqueName: \"kubernetes.io/projected/8df09393-7557-4bf8-8cbf-e2aa59df04b6-kube-api-access-n26fr\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529303 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc 
kubenswrapper[4744]: I0930 02:57:15.529324 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-console-config\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529345 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-config\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529390 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529414 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16046530-d8fe-40bb-9a22-2a021648faa9-audit-dir\") pod 
\"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529456 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529501 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a46c375-83ec-4de9-8047-3abde5224588-config\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529555 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-tls\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: 
I0930 02:57:15.529578 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-default-certificate\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529627 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-metrics-certs\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-ca\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/607782d2-50af-4b1e-a3fe-603ad6267bc9-machine-approver-tls\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529716 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42f51b8-542f-4784-88cd-89832dfc1999-service-ca-bundle\") pod \"router-default-5444994796-r5jzt\" (UID: 
\"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529737 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8w6s\" (UniqueName: \"kubernetes.io/projected/e42f51b8-542f-4784-88cd-89832dfc1999-kube-api-access-h8w6s\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529765 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529796 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmb6m\" (UniqueName: \"kubernetes.io/projected/63ddf643-780d-438a-bf7b-bf73096c9902-kube-api-access-wmb6m\") pod \"multus-admission-controller-857f4d67dd-wbwqr\" (UID: \"63ddf643-780d-438a-bf7b-bf73096c9902\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529820 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-stats-auth\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.529842 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2qkk9\" (UniqueName: \"kubernetes.io/projected/16046530-d8fe-40bb-9a22-2a021648faa9-kube-api-access-2qkk9\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.535353 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.035333371 +0000 UTC m=+163.208553355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630505 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.630651 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.130616425 +0000 UTC m=+163.303836399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630786 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-proxy-tls\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630813 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630838 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8ef71aa6-910d-4a67-bef9-2e37d689408b-serving-cert\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630863 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dk4s\" (UniqueName: \"kubernetes.io/projected/eeb7ab47-53ea-434c-8367-ad667abe4168-kube-api-access-7dk4s\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630892 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpxl\" (UniqueName: \"kubernetes.io/projected/b2676764-efb6-4e02-9012-74b8675e7bff-kube-api-access-nnpxl\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.630944 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa8552c2-312f-40d7-abdb-160f831e5c04-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631036 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-images\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631097 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8c390261-32ad-4d03-82b2-261cbafe52f9-node-bootstrap-token\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631120 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631317 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-oauth-config\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631441 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/8c390261-32ad-4d03-82b2-261cbafe52f9-certs\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631487 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2676764-efb6-4e02-9012-74b8675e7bff-secret-volume\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631528 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxr6\" (UniqueName: \"kubernetes.io/projected/8ef71aa6-910d-4a67-bef9-2e37d689408b-kube-api-access-dnxr6\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631558 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607782d2-50af-4b1e-a3fe-603ad6267bc9-config\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631616 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-client\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631676 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-serving-cert\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.631743 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632192 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607782d2-50af-4b1e-a3fe-603ad6267bc9-config\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1add32c6-5ed4-415a-a8f3-0de2fb3f71d9-cert\") pod \"ingress-canary-gpn94\" (UID: \"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9\") " pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632833 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmfg\" (UniqueName: \"kubernetes.io/projected/d0c87777-c4d8-4783-93a0-67e2b680f770-kube-api-access-wqmfg\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632866 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-service-ca\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-socket-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68454d10-9f26-41d7-9b42-1ee60a78a809-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632967 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58sf\" (UniqueName: \"kubernetes.io/projected/97f8801e-bb23-4a2f-bd03-9711d966d3c7-kube-api-access-r58sf\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.632992 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-metrics-tls\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-trusted-ca\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2251154-99b6-4b82-ad16-19e36f3eaf8e-srv-cert\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633075 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633111 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-serving-cert\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633142 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-config\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633173 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8df09393-7557-4bf8-8cbf-e2aa59df04b6-node-pullsecrets\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633224 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-metrics-tls\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633251 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-service-ca\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633279 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c2251154-99b6-4b82-ad16-19e36f3eaf8e-kube-api-access-7jxsz\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc 
kubenswrapper[4744]: I0930 02:57:15.633322 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633350 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfa0572-7577-49a6-9845-782e3ca7df2f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633391 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d6dw\" (UniqueName: \"kubernetes.io/projected/8c390261-32ad-4d03-82b2-261cbafe52f9-kube-api-access-8d6dw\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c87777-c4d8-4783-93a0-67e2b680f770-proxy-tls\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633464 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9xs\" (UniqueName: 
\"kubernetes.io/projected/1ba3df4a-66a1-47bc-924b-542c7ca89389-kube-api-access-rb9xs\") pod \"dns-operator-744455d44c-pzp4p\" (UID: \"1ba3df4a-66a1-47bc-924b-542c7ca89389\") " pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633494 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633524 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n26fr\" (UniqueName: \"kubernetes.io/projected/8df09393-7557-4bf8-8cbf-e2aa59df04b6-kube-api-access-n26fr\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633550 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfgsl\" (UniqueName: \"kubernetes.io/projected/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-kube-api-access-vfgsl\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68454d10-9f26-41d7-9b42-1ee60a78a809-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc 
kubenswrapper[4744]: I0930 02:57:15.633590 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68454d10-9f26-41d7-9b42-1ee60a78a809-config\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633608 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfa0572-7577-49a6-9845-782e3ca7df2f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633640 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633662 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-console-config\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633683 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-config\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: 
\"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633767 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d9401991-d332-4a04-85be-b3d5a7b00c27-signing-cabundle\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633818 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633850 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d9401991-d332-4a04-85be-b3d5a7b00c27-signing-key\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633874 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qxc\" (UniqueName: \"kubernetes.io/projected/d9401991-d332-4a04-85be-b3d5a7b00c27-kube-api-access-t8qxc\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633900 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-tls\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633922 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-default-certificate\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 
02:57:15.633945 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/607782d2-50af-4b1e-a3fe-603ad6267bc9-machine-approver-tls\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmb6m\" (UniqueName: \"kubernetes.io/projected/63ddf643-780d-438a-bf7b-bf73096c9902-kube-api-access-wmb6m\") pod \"multus-admission-controller-857f4d67dd-wbwqr\" (UID: \"63ddf643-780d-438a-bf7b-bf73096c9902\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.633991 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8w6s\" (UniqueName: \"kubernetes.io/projected/e42f51b8-542f-4784-88cd-89832dfc1999-kube-api-access-h8w6s\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634016 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-stats-auth\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634062 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-service-ca\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkk9\" (UniqueName: \"kubernetes.io/projected/16046530-d8fe-40bb-9a22-2a021648faa9-kube-api-access-2qkk9\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634619 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-mountpoint-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97f8801e-bb23-4a2f-bd03-9711d966d3c7-tmpfs\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8df09393-7557-4bf8-8cbf-e2aa59df04b6-node-pullsecrets\") pod \"apiserver-76f77b778f-xxnzf\" (UID: 
\"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a46c375-83ec-4de9-8047-3abde5224588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634928 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ssdfc\" (UID: \"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-plugins-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.634993 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfdd\" (UniqueName: \"kubernetes.io/projected/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-kube-api-access-tmfdd\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635322 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ea8335-da26-4a4d-b35e-87870d3d61b1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635360 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l46mg\" (UniqueName: \"kubernetes.io/projected/e0918c73-ef87-42c8-8395-9499c5a91e2b-kube-api-access-l46mg\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635412 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a46c375-83ec-4de9-8047-3abde5224588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635440 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-trusted-ca\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635472 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9ksk\" (UniqueName: 
\"kubernetes.io/projected/aa8552c2-312f-40d7-abdb-160f831e5c04-kube-api-access-r9ksk\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635498 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635519 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-config-volume\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-audit-policies\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635569 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635591 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-image-import-ca\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635615 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f8801e-bb23-4a2f-bd03-9711d966d3c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635640 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stpjx\" (UniqueName: \"kubernetes.io/projected/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-kube-api-access-stpjx\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635668 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4tvc\" (UniqueName: \"kubernetes.io/projected/fd4c491e-16e3-4e31-a4a9-314d53ceada8-kube-api-access-c4tvc\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635693 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-562sj\" 
(UniqueName: \"kubernetes.io/projected/1add32c6-5ed4-415a-a8f3-0de2fb3f71d9-kube-api-access-562sj\") pod \"ingress-canary-gpn94\" (UID: \"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9\") " pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635720 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635745 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635767 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-client-ca\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 
crc kubenswrapper[4744]: I0930 02:57:15.635816 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63ddf643-780d-438a-bf7b-bf73096c9902-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wbwqr\" (UID: \"63ddf643-780d-438a-bf7b-bf73096c9902\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635842 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635871 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rg6\" (UniqueName: \"kubernetes.io/projected/32a4e9f9-124b-47f6-821c-44714e635968-kube-api-access-95rg6\") pod \"package-server-manager-789f6589d5-n9lhl\" (UID: \"32a4e9f9-124b-47f6-821c-44714e635968\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635892 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2676764-efb6-4e02-9012-74b8675e7bff-config-volume\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635916 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7bfa0572-7577-49a6-9845-782e3ca7df2f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635939 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5965n\" (UniqueName: \"kubernetes.io/projected/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-kube-api-access-5965n\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635962 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635989 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrlk8\" (UniqueName: \"kubernetes.io/projected/6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e-kube-api-access-nrlk8\") pod \"control-plane-machine-set-operator-78cbb6b69f-ssdfc\" (UID: \"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-trusted-ca\") pod \"image-registry-697d97f7c8-q47dv\" (UID: 
\"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636014 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-oauth-serving-cert\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636116 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pf7\" (UniqueName: \"kubernetes.io/projected/dd70937c-9e84-468b-b81f-b9f400436aec-kube-api-access-67pf7\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636176 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-etcd-client\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636204 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-registration-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636235 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636259 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8mt\" (UniqueName: \"kubernetes.io/projected/486b7838-3a4d-45be-b4b7-c2ec085d7a07-kube-api-access-7r8mt\") pod \"migrator-59844c95c7-pczrt\" (UID: \"486b7838-3a4d-45be-b4b7-c2ec085d7a07\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636287 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-serving-cert\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636335 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 
02:57:15.636359 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n8g9\" (UniqueName: \"kubernetes.io/projected/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-kube-api-access-8n8g9\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63ea8335-da26-4a4d-b35e-87870d3d61b1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636452 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-config\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636488 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-audit\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636602 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-config\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blmrn\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-kube-api-access-blmrn\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636646 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-config\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636975 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-oauth-serving-cert\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.637366 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-config\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.637518 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xxnzf\" (UID: 
\"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.637760 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.636656 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-config\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.635641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-console-config\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.638504 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.639929 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: 
\"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.640117 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-audit\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.640208 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ea8335-da26-4a4d-b35e-87870d3d61b1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.640821 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-serving-cert\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.640852 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-tls\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.641453 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.642048 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.142027776 +0000 UTC m=+163.315247760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.642215 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.642517 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-audit-policies\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.643316 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-default-certificate\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.643321 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-service-ca\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.643833 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-client\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.644952 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/607782d2-50af-4b1e-a3fe-603ad6267bc9-machine-approver-tls\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8z5v\" (UniqueName: \"kubernetes.io/projected/607782d2-50af-4b1e-a3fe-603ad6267bc9-kube-api-access-d8z5v\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 
crc kubenswrapper[4744]: I0930 02:57:15.645126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-trusted-ca-bundle\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645181 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb7ab47-53ea-434c-8367-ad667abe4168-serving-cert\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645207 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2251154-99b6-4b82-ad16-19e36f3eaf8e-profile-collector-cert\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645230 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlg4s\" (UniqueName: \"kubernetes.io/projected/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-kube-api-access-tlg4s\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8df09393-7557-4bf8-8cbf-e2aa59df04b6-image-import-ca\") pod \"apiserver-76f77b778f-xxnzf\" (UID: 
\"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645831 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-config\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.645934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-client-ca\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.646240 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-srv-cert\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.646904 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-certificates\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.646997 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-service-ca-bundle\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647010 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-serving-cert\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647022 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8df09393-7557-4bf8-8cbf-e2aa59df04b6-audit-dir\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647095 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-config\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647152 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0c87777-c4d8-4783-93a0-67e2b680f770-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647183 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f8801e-bb23-4a2f-bd03-9711d966d3c7-webhook-cert\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647632 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef71aa6-910d-4a67-bef9-2e37d689408b-service-ca-bundle\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647677 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8df09393-7557-4bf8-8cbf-e2aa59df04b6-audit-dir\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.647703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a46c375-83ec-4de9-8047-3abde5224588-config\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649200 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649236 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkcn\" (UniqueName: \"kubernetes.io/projected/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-kube-api-access-vgkcn\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649360 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16046530-d8fe-40bb-9a22-2a021648faa9-audit-dir\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649713 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16046530-d8fe-40bb-9a22-2a021648faa9-audit-dir\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649763 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-trusted-ca-bundle\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-certificates\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.649530 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-metrics-certs\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.650979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-ca\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42f51b8-542f-4784-88cd-89832dfc1999-service-ca-bundle\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8552c2-312f-40d7-abdb-160f831e5c04-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651168 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-csi-data-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqrlf\" (UniqueName: \"kubernetes.io/projected/63ea8335-da26-4a4d-b35e-87870d3d61b1-kube-api-access-zqrlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651260 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2msd\" (UniqueName: \"kubernetes.io/projected/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-kube-api-access-x2msd\") pod \"csi-hostpathplugin-jd84s\" (UID: 
\"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651308 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-serving-cert\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651342 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ba3df4a-66a1-47bc-924b-542c7ca89389-metrics-tls\") pod \"dns-operator-744455d44c-pzp4p\" (UID: \"1ba3df4a-66a1-47bc-924b-542c7ca89389\") " pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0918c73-ef87-42c8-8395-9499c5a91e2b-serving-cert\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651438 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-bound-sa-token\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651489 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-encryption-config\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651514 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/607782d2-50af-4b1e-a3fe-603ad6267bc9-auth-proxy-config\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651582 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/32a4e9f9-124b-47f6-821c-44714e635968-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n9lhl\" (UID: \"32a4e9f9-124b-47f6-821c-44714e635968\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.651889 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a46c375-83ec-4de9-8047-3abde5224588-config\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.652398 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.652661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a46c375-83ec-4de9-8047-3abde5224588-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.652724 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eeb7ab47-53ea-434c-8367-ad667abe4168-etcd-ca\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.653169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42f51b8-542f-4784-88cd-89832dfc1999-service-ca-bundle\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.653332 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.654207 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-stats-auth\") pod \"router-default-5444994796-r5jzt\" (UID: 
\"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.654666 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/607782d2-50af-4b1e-a3fe-603ad6267bc9-auth-proxy-config\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.655160 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.657402 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.668562 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-oauth-config\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.668793 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8ef71aa6-910d-4a67-bef9-2e37d689408b-serving-cert\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.668823 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.668992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.669964 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42f51b8-542f-4784-88cd-89832dfc1999-metrics-certs\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.670317 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.670508 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ba3df4a-66a1-47bc-924b-542c7ca89389-metrics-tls\") pod \"dns-operator-744455d44c-pzp4p\" (UID: \"1ba3df4a-66a1-47bc-924b-542c7ca89389\") " pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.670608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-serving-cert\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.670998 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63ea8335-da26-4a4d-b35e-87870d3d61b1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.671012 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63ddf643-780d-438a-bf7b-bf73096c9902-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wbwqr\" (UID: \"63ddf643-780d-438a-bf7b-bf73096c9902\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.671019 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-etcd-client\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 
02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.671189 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb7ab47-53ea-434c-8367-ad667abe4168-serving-cert\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.671885 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.674091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.681635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.681775 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.681857 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0918c73-ef87-42c8-8395-9499c5a91e2b-serving-cert\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.692672 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dk4s\" (UniqueName: \"kubernetes.io/projected/eeb7ab47-53ea-434c-8367-ad667abe4168-kube-api-access-7dk4s\") pod \"etcd-operator-b45778765-fpq6t\" (UID: \"eeb7ab47-53ea-434c-8367-ad667abe4168\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.697788 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.699206 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8df09393-7557-4bf8-8cbf-e2aa59df04b6-encryption-config\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.701650 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnxr6\" (UniqueName: \"kubernetes.io/projected/8ef71aa6-910d-4a67-bef9-2e37d689408b-kube-api-access-dnxr6\") pod \"authentication-operator-69f744f599-7d4sv\" (UID: \"8ef71aa6-910d-4a67-bef9-2e37d689408b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.721568 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmb6m\" (UniqueName: \"kubernetes.io/projected/63ddf643-780d-438a-bf7b-bf73096c9902-kube-api-access-wmb6m\") pod \"multus-admission-controller-857f4d67dd-wbwqr\" (UID: \"63ddf643-780d-438a-bf7b-bf73096c9902\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.743635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8w6s\" (UniqueName: \"kubernetes.io/projected/e42f51b8-542f-4784-88cd-89832dfc1999-kube-api-access-h8w6s\") pod \"router-default-5444994796-r5jzt\" (UID: \"e42f51b8-542f-4784-88cd-89832dfc1999\") " pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.754843 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755166 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4tvc\" (UniqueName: \"kubernetes.io/projected/fd4c491e-16e3-4e31-a4a9-314d53ceada8-kube-api-access-c4tvc\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755205 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-562sj\" (UniqueName: \"kubernetes.io/projected/1add32c6-5ed4-415a-a8f3-0de2fb3f71d9-kube-api-access-562sj\") pod \"ingress-canary-gpn94\" (UID: \"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9\") " pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755235 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f8801e-bb23-4a2f-bd03-9711d966d3c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rg6\" (UniqueName: \"kubernetes.io/projected/32a4e9f9-124b-47f6-821c-44714e635968-kube-api-access-95rg6\") pod \"package-server-manager-789f6589d5-n9lhl\" (UID: \"32a4e9f9-124b-47f6-821c-44714e635968\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755286 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2676764-efb6-4e02-9012-74b8675e7bff-config-volume\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755317 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bfa0572-7577-49a6-9845-782e3ca7df2f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755397 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrlk8\" (UniqueName: \"kubernetes.io/projected/6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e-kube-api-access-nrlk8\") pod \"control-plane-machine-set-operator-78cbb6b69f-ssdfc\" (UID: \"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755430 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5965n\" (UniqueName: \"kubernetes.io/projected/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-kube-api-access-5965n\") pod \"machine-config-operator-74547568cd-tb8tv\" 
(UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-registration-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755477 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8mt\" (UniqueName: \"kubernetes.io/projected/486b7838-3a4d-45be-b4b7-c2ec085d7a07-kube-api-access-7r8mt\") pod \"migrator-59844c95c7-pczrt\" (UID: \"486b7838-3a4d-45be-b4b7-c2ec085d7a07\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n8g9\" (UniqueName: \"kubernetes.io/projected/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-kube-api-access-8n8g9\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755527 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-config\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755567 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/c2251154-99b6-4b82-ad16-19e36f3eaf8e-profile-collector-cert\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755598 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-srv-cert\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755624 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0c87777-c4d8-4783-93a0-67e2b680f770-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755646 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f8801e-bb23-4a2f-bd03-9711d966d3c7-webhook-cert\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755668 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: 
I0930 02:57:15.755691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkcn\" (UniqueName: \"kubernetes.io/projected/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-kube-api-access-vgkcn\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755727 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8552c2-312f-40d7-abdb-160f831e5c04-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755756 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-csi-data-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2msd\" (UniqueName: \"kubernetes.io/projected/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-kube-api-access-x2msd\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755799 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-serving-cert\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") 
" pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755835 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/32a4e9f9-124b-47f6-821c-44714e635968-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n9lhl\" (UID: \"32a4e9f9-124b-47f6-821c-44714e635968\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755863 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-proxy-tls\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755887 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpxl\" (UniqueName: \"kubernetes.io/projected/b2676764-efb6-4e02-9012-74b8675e7bff-kube-api-access-nnpxl\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755922 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa8552c2-312f-40d7-abdb-160f831e5c04-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755945 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8c390261-32ad-4d03-82b2-261cbafe52f9-node-bootstrap-token\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.755967 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-images\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756000 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2676764-efb6-4e02-9012-74b8675e7bff-secret-volume\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8c390261-32ad-4d03-82b2-261cbafe52f9-certs\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1add32c6-5ed4-415a-a8f3-0de2fb3f71d9-cert\") pod \"ingress-canary-gpn94\" (UID: \"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9\") " pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756071 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmfg\" (UniqueName: \"kubernetes.io/projected/d0c87777-c4d8-4783-93a0-67e2b680f770-kube-api-access-wqmfg\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756100 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-socket-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68454d10-9f26-41d7-9b42-1ee60a78a809-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756149 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58sf\" (UniqueName: \"kubernetes.io/projected/97f8801e-bb23-4a2f-bd03-9711d966d3c7-kube-api-access-r58sf\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756173 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2251154-99b6-4b82-ad16-19e36f3eaf8e-srv-cert\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-metrics-tls\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756224 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756258 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-metrics-tls\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756286 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c2251154-99b6-4b82-ad16-19e36f3eaf8e-kube-api-access-7jxsz\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfa0572-7577-49a6-9845-782e3ca7df2f-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756359 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2676764-efb6-4e02-9012-74b8675e7bff-config-volume\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756440 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c87777-c4d8-4783-93a0-67e2b680f770-proxy-tls\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.756492 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.256472969 +0000 UTC m=+163.429692943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756542 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d6dw\" (UniqueName: \"kubernetes.io/projected/8c390261-32ad-4d03-82b2-261cbafe52f9-kube-api-access-8d6dw\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756567 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfgsl\" (UniqueName: \"kubernetes.io/projected/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-kube-api-access-vfgsl\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756589 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68454d10-9f26-41d7-9b42-1ee60a78a809-config\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756614 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfa0572-7577-49a6-9845-782e3ca7df2f-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756634 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68454d10-9f26-41d7-9b42-1ee60a78a809-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756653 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d9401991-d332-4a04-85be-b3d5a7b00c27-signing-cabundle\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756670 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d9401991-d332-4a04-85be-b3d5a7b00c27-signing-key\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qxc\" (UniqueName: \"kubernetes.io/projected/d9401991-d332-4a04-85be-b3d5a7b00c27-kube-api-access-t8qxc\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756740 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-mountpoint-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756758 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97f8801e-bb23-4a2f-bd03-9711d966d3c7-tmpfs\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756792 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ssdfc\" (UID: \"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-plugins-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756832 
4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfdd\" (UniqueName: \"kubernetes.io/projected/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-kube-api-access-tmfdd\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756862 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-trusted-ca\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756881 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9ksk\" (UniqueName: \"kubernetes.io/projected/aa8552c2-312f-40d7-abdb-160f831e5c04-kube-api-access-r9ksk\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756897 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-config-volume\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.756916 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") 
" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.757708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.759186 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-config-volume\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.759185 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97f8801e-bb23-4a2f-bd03-9711d966d3c7-tmpfs\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.759252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-mountpoint-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.759677 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-csi-data-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " 
pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.760138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-registration-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.760209 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-plugins-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.760735 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-trusted-ca\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.761081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-config\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.762704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0c87777-c4d8-4783-93a0-67e2b680f770-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.763639 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-proxy-tls\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.765598 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-socket-dir\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.765802 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.765990 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8552c2-312f-40d7-abdb-160f831e5c04-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.766123 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.766343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1add32c6-5ed4-415a-a8f3-0de2fb3f71d9-cert\") pod \"ingress-canary-gpn94\" (UID: \"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9\") " pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.766492 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.766944 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68454d10-9f26-41d7-9b42-1ee60a78a809-config\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.767047 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-images\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.767089 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7bfa0572-7577-49a6-9845-782e3ca7df2f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.767758 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d9401991-d332-4a04-85be-b3d5a7b00c27-signing-cabundle\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.768050 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8c390261-32ad-4d03-82b2-261cbafe52f9-node-bootstrap-token\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.768061 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa8552c2-312f-40d7-abdb-160f831e5c04-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.768804 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-metrics-tls\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.769395 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-serving-cert\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.770239 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97f8801e-bb23-4a2f-bd03-9711d966d3c7-webhook-cert\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.771417 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2251154-99b6-4b82-ad16-19e36f3eaf8e-profile-collector-cert\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.771515 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2251154-99b6-4b82-ad16-19e36f3eaf8e-srv-cert\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.771564 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97f8801e-bb23-4a2f-bd03-9711d966d3c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 
02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.772031 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/32a4e9f9-124b-47f6-821c-44714e635968-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-n9lhl\" (UID: \"32a4e9f9-124b-47f6-821c-44714e635968\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.780102 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68454d10-9f26-41d7-9b42-1ee60a78a809-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.780157 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d9401991-d332-4a04-85be-b3d5a7b00c27-signing-key\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.780283 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8c390261-32ad-4d03-82b2-261cbafe52f9-certs\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.780932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfa0572-7577-49a6-9845-782e3ca7df2f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.781139 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ssdfc\" (UID: \"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.781156 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-srv-cert\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.781320 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-metrics-tls\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.782242 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2676764-efb6-4e02-9012-74b8675e7bff-secret-volume\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.785225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkk9\" (UniqueName: 
\"kubernetes.io/projected/16046530-d8fe-40bb-9a22-2a021648faa9-kube-api-access-2qkk9\") pod \"oauth-openshift-558db77b4-5rwhq\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") " pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.789808 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0c87777-c4d8-4783-93a0-67e2b680f770-proxy-tls\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.791859 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a46c375-83ec-4de9-8047-3abde5224588-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4w5fd\" (UID: \"6a46c375-83ec-4de9-8047-3abde5224588\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.820352 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l46mg\" (UniqueName: \"kubernetes.io/projected/e0918c73-ef87-42c8-8395-9499c5a91e2b-kube-api-access-l46mg\") pod \"route-controller-manager-6576b87f9c-pjvjd\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.825684 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pf7\" (UniqueName: \"kubernetes.io/projected/dd70937c-9e84-468b-b81f-b9f400436aec-kube-api-access-67pf7\") pod \"console-f9d7485db-7gvsx\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: 
I0930 02:57:15.845110 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blmrn\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-kube-api-access-blmrn\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.861453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.861935 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.361921506 +0000 UTC m=+163.535141480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.870455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n26fr\" (UniqueName: \"kubernetes.io/projected/8df09393-7557-4bf8-8cbf-e2aa59df04b6-kube-api-access-n26fr\") pod \"apiserver-76f77b778f-xxnzf\" (UID: \"8df09393-7557-4bf8-8cbf-e2aa59df04b6\") " pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.877449 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stpjx\" (UniqueName: \"kubernetes.io/projected/f387f631-c1e7-4dbb-ade0-cdeb4f4d724d-kube-api-access-stpjx\") pod \"openshift-apiserver-operator-796bbdcf4f-k7tkh\" (UID: \"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.887192 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.900956 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fpq6t"] Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.903021 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9xs\" (UniqueName: \"kubernetes.io/projected/1ba3df4a-66a1-47bc-924b-542c7ca89389-kube-api-access-rb9xs\") pod \"dns-operator-744455d44c-pzp4p\" (UID: \"1ba3df4a-66a1-47bc-924b-542c7ca89389\") " pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.908097 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.916919 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.920614 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8z5v\" (UniqueName: \"kubernetes.io/projected/607782d2-50af-4b1e-a3fe-603ad6267bc9-kube-api-access-d8z5v\") pod \"machine-approver-56656f9798-zrd7j\" (UID: \"607782d2-50af-4b1e-a3fe-603ad6267bc9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: W0930 02:57:15.923079 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb7ab47_53ea_434c_8367_ad667abe4168.slice/crio-0fac1390b63004c383842d66523e28218b28d36f0bf312f252fbd9691ef645d3 WatchSource:0}: Error finding container 0fac1390b63004c383842d66523e28218b28d36f0bf312f252fbd9691ef645d3: Status 404 returned error can't find the container with id 0fac1390b63004c383842d66523e28218b28d36f0bf312f252fbd9691ef645d3 Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.923242 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.931035 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.937998 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.943053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlg4s\" (UniqueName: \"kubernetes.io/projected/ca4c6d8e-e39a-4302-af0b-029aa35ca1e6-kube-api-access-tlg4s\") pod \"openshift-config-operator-7777fb866f-wc6fj\" (UID: \"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.955726 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.963055 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrlf\" (UniqueName: \"kubernetes.io/projected/63ea8335-da26-4a4d-b35e-87870d3d61b1-kube-api-access-zqrlf\") pod \"openshift-controller-manager-operator-756b6f6bc6-92m7x\" (UID: \"63ea8335-da26-4a4d-b35e-87870d3d61b1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.964635 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.964729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.965046 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.46500104 +0000 UTC m=+163.638221184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: E0930 02:57:15.966109 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.466095643 +0000 UTC m=+163.639315617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.967728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.973981 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.977750 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.984271 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.988970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-bound-sa-token\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:15 crc kubenswrapper[4744]: I0930 02:57:15.991833 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.025998 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bfa0572-7577-49a6-9845-782e3ca7df2f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fcck5\" (UID: \"7bfa0572-7577-49a6-9845-782e3ca7df2f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.044310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4tvc\" (UniqueName: \"kubernetes.io/projected/fd4c491e-16e3-4e31-a4a9-314d53ceada8-kube-api-access-c4tvc\") pod \"marketplace-operator-79b997595-h5rlq\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.065014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qxc\" (UniqueName: \"kubernetes.io/projected/d9401991-d332-4a04-85be-b3d5a7b00c27-kube-api-access-t8qxc\") pod \"service-ca-9c57cc56f-lq7lq\" (UID: \"d9401991-d332-4a04-85be-b3d5a7b00c27\") " pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" 
Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.077788 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.078012 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.577974178 +0000 UTC m=+163.751194152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.078156 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.078859 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:16.578850405 +0000 UTC m=+163.752070379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.082613 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.089466 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9ksk\" (UniqueName: \"kubernetes.io/projected/aa8552c2-312f-40d7-abdb-160f831e5c04-kube-api-access-r9ksk\") pod \"kube-storage-version-migrator-operator-b67b599dd-c8gct\" (UID: \"aa8552c2-312f-40d7-abdb-160f831e5c04\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.106996 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.110101 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfdd\" (UniqueName: \"kubernetes.io/projected/c68467b0-9ad1-4164-bec6-3e0f0f2abe87-kube-api-access-tmfdd\") pod \"olm-operator-6b444d44fb-zd2rc\" (UID: \"c68467b0-9ad1-4164-bec6-3e0f0f2abe87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.137216 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2msd\" (UniqueName: \"kubernetes.io/projected/cfd94804-02a1-436d-b2a4-2fd4eb7502ab-kube-api-access-x2msd\") pod \"csi-hostpathplugin-jd84s\" (UID: \"cfd94804-02a1-436d-b2a4-2fd4eb7502ab\") " pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.145232 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d6dw\" (UniqueName: \"kubernetes.io/projected/8c390261-32ad-4d03-82b2-261cbafe52f9-kube-api-access-8d6dw\") pod \"machine-config-server-nhkgz\" (UID: \"8c390261-32ad-4d03-82b2-261cbafe52f9\") " pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.151350 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.181918 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nhkgz" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.182149 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.182471 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.682445815 +0000 UTC m=+163.855665789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.182734 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.183300 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.68328173 +0000 UTC m=+163.856501714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.189883 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-562sj\" (UniqueName: \"kubernetes.io/projected/1add32c6-5ed4-415a-a8f3-0de2fb3f71d9-kube-api-access-562sj\") pod \"ingress-canary-gpn94\" (UID: \"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9\") " pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.192757 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7d4sv"] Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.193794 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.203671 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrlk8\" (UniqueName: \"kubernetes.io/projected/6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e-kube-api-access-nrlk8\") pod \"control-plane-machine-set-operator-78cbb6b69f-ssdfc\" (UID: \"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.207848 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8mt\" (UniqueName: \"kubernetes.io/projected/486b7838-3a4d-45be-b4b7-c2ec085d7a07-kube-api-access-7r8mt\") pod \"migrator-59844c95c7-pczrt\" (UID: \"486b7838-3a4d-45be-b4b7-c2ec085d7a07\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.211265 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jd84s" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.226589 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5965n\" (UniqueName: \"kubernetes.io/projected/a3a14b54-caea-4d75-baa6-cf8ddd2cc70f-kube-api-access-5965n\") pod \"machine-config-operator-74547568cd-tb8tv\" (UID: \"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.258106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpxl\" (UniqueName: \"kubernetes.io/projected/b2676764-efb6-4e02-9012-74b8675e7bff-kube-api-access-nnpxl\") pod \"collect-profiles-29320005-gxcgx\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.285225 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.286018 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68454d10-9f26-41d7-9b42-1ee60a78a809-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lt5f2\" (UID: \"68454d10-9f26-41d7-9b42-1ee60a78a809\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.286083 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.786061405 +0000 UTC m=+163.959281379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.286674 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n8g9\" (UniqueName: \"kubernetes.io/projected/ef7559e3-bf04-42a4-bb27-33dd6c635ffd-kube-api-access-8n8g9\") pod \"dns-default-zpfzk\" (UID: \"ef7559e3-bf04-42a4-bb27-33dd6c635ffd\") " pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.304432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfgsl\" (UniqueName: \"kubernetes.io/projected/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-kube-api-access-vfgsl\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.305382 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.324774 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.332712 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.333346 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkcn\" (UniqueName: \"kubernetes.io/projected/12a32ac7-5e93-4dbf-af6c-3f60ac33e944-kube-api-access-vgkcn\") pod \"service-ca-operator-777779d784-txgv6\" (UID: \"12a32ac7-5e93-4dbf-af6c-3f60ac33e944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.342891 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rg6\" (UniqueName: \"kubernetes.io/projected/32a4e9f9-124b-47f6-821c-44714e635968-kube-api-access-95rg6\") pod \"package-server-manager-789f6589d5-n9lhl\" (UID: \"32a4e9f9-124b-47f6-821c-44714e635968\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.349943 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.365502 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.382064 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.389057 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.389485 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.88947107 +0000 UTC m=+164.062691034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.395813 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.400294 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58sf\" (UniqueName: \"kubernetes.io/projected/97f8801e-bb23-4a2f-bd03-9711d966d3c7-kube-api-access-r58sf\") pod \"packageserver-d55dfcdfc-4jrqr\" (UID: \"97f8801e-bb23-4a2f-bd03-9711d966d3c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.400544 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmfg\" (UniqueName: \"kubernetes.io/projected/d0c87777-c4d8-4783-93a0-67e2b680f770-kube-api-access-wqmfg\") pod \"machine-config-controller-84d6567774-g9w4l\" (UID: \"d0c87777-c4d8-4783-93a0-67e2b680f770\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.416276 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.419464 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0ddbf2ad-5319-4838-9ff2-b154ae354bf1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f2tn5\" (UID: \"0ddbf2ad-5319-4838-9ff2-b154ae354bf1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.424664 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.433350 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxsz\" (UniqueName: \"kubernetes.io/projected/c2251154-99b6-4b82-ad16-19e36f3eaf8e-kube-api-access-7jxsz\") pod \"catalog-operator-68c6474976-qrrpz\" (UID: \"c2251154-99b6-4b82-ad16-19e36f3eaf8e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.442123 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.457909 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gpn94" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.468630 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.491065 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.491458 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:16.99143863 +0000 UTC m=+164.164658604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.561675 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xxnzf"] Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.595273 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.596814 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.096796914 +0000 UTC m=+164.270016888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.617146 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.643606 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.646452 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" event={"ID":"ed0e7b5c-f54d-4ab7-853d-72d34eed714d","Type":"ContainerStarted","Data":"baa0e375dbce9fe277275f5618b0d39bd2f80c190926e18ecc91b40b0b34d2bc"} Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.682749 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.682957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nhkgz" event={"ID":"8c390261-32ad-4d03-82b2-261cbafe52f9","Type":"ContainerStarted","Data":"354d92f82e533d17d66f45510bde34c8e19f8684b2710dca2a6c0aeac88315de"} Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.697020 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.697574 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.197551387 +0000 UTC m=+164.370771361 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.700972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r5jzt" event={"ID":"e42f51b8-542f-4784-88cd-89832dfc1999","Type":"ContainerStarted","Data":"150a4b619d6e299192264524d5c58956c7be8df831e08e53aa2ae2fcc13597b8"} Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.701750 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" event={"ID":"607782d2-50af-4b1e-a3fe-603ad6267bc9","Type":"ContainerStarted","Data":"221154bda4533a6c2f2e2e1f86c127ccff69d4f7b85ebb81ad53894f011e0c4f"} Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.702891 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" event={"ID":"eeb7ab47-53ea-434c-8367-ad667abe4168","Type":"ContainerStarted","Data":"0fac1390b63004c383842d66523e28218b28d36f0bf312f252fbd9691ef645d3"} Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.712643 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" event={"ID":"8ef71aa6-910d-4a67-bef9-2e37d689408b","Type":"ContainerStarted","Data":"8f04383ffa1f4b130b71ec3ec19532789a03f20e6bd10100c3b15842398b629f"} Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.714224 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-bpp4m container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.714263 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bpp4m" podUID="a50ca402-327c-41ea-832c-15ad7932d8f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.725775 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.769322 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.800167 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.801968 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.301951081 +0000 UTC m=+164.475171055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.903103 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:16 crc kubenswrapper[4744]: E0930 02:57:16.904232 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.4041977 +0000 UTC m=+164.577417674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:16 crc kubenswrapper[4744]: I0930 02:57:16.908655 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cb2hl" podStartSLOduration=138.908637016 podStartE2EDuration="2m18.908637016s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:16.906102819 +0000 UTC m=+164.079322793" watchObservedRunningTime="2025-09-30 02:57:16.908637016 +0000 UTC m=+164.081856990" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.005141 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.005636 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.505613243 +0000 UTC m=+164.678833307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.107819 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.108722 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.608684457 +0000 UTC m=+164.781904431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.179390 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" podStartSLOduration=139.179340006 podStartE2EDuration="2m19.179340006s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:17.169290167 +0000 UTC m=+164.342510141" watchObservedRunningTime="2025-09-30 02:57:17.179340006 +0000 UTC m=+164.352559990" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.202881 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd"] Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.212039 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.212459 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:17.712441902 +0000 UTC m=+164.885661876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.225183 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5rwhq"] Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.227339 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wqfz8" podStartSLOduration=139.227317608 podStartE2EDuration="2m19.227317608s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:17.225500703 +0000 UTC m=+164.398720677" watchObservedRunningTime="2025-09-30 02:57:17.227317608 +0000 UTC m=+164.400537582" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.297696 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jxxmx" podStartSLOduration=139.297675208 podStartE2EDuration="2m19.297675208s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:17.255788772 +0000 UTC m=+164.429008736" watchObservedRunningTime="2025-09-30 02:57:17.297675208 +0000 UTC m=+164.470895182" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 
02:57:17.298470 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bpp4m" podStartSLOduration=139.298461712 podStartE2EDuration="2m19.298461712s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:17.295480991 +0000 UTC m=+164.468700965" watchObservedRunningTime="2025-09-30 02:57:17.298461712 +0000 UTC m=+164.471681686" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.312957 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.313488 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.813464863 +0000 UTC m=+164.986684837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.313738 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.314058 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.81405114 +0000 UTC m=+164.987271114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.409059 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh"] Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.418588 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.419308 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:17.919284871 +0000 UTC m=+165.092504845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.454553 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7gvsx"] Sep 30 02:57:17 crc kubenswrapper[4744]: W0930 02:57:17.493598 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd70937c_9e84_468b_b81f_b9f400436aec.slice/crio-ddf9dc7506d8440af88006361f8b25d192e53101d18c7ca62fa4466be6202c9c WatchSource:0}: Error finding container ddf9dc7506d8440af88006361f8b25d192e53101d18c7ca62fa4466be6202c9c: Status 404 returned error can't find the container with id ddf9dc7506d8440af88006361f8b25d192e53101d18c7ca62fa4466be6202c9c Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.520104 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.520487 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.020472247 +0000 UTC m=+165.193692221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.627900 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.628607 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.128570725 +0000 UTC m=+165.301790699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.681918 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.688402 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdncf" podStartSLOduration=139.68834359 podStartE2EDuration="2m19.68834359s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:17.663054904 +0000 UTC m=+164.836274878" watchObservedRunningTime="2025-09-30 02:57:17.68834359 +0000 UTC m=+164.861563564" Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.715621 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj"] Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.729655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.730201 4744 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.230182634 +0000 UTC m=+165.403402608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.745721 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nhkgz" event={"ID":"8c390261-32ad-4d03-82b2-261cbafe52f9","Type":"ContainerStarted","Data":"36134e1a7be107a565d005989fce0b011c589d71adb641a63a41c3638f7b3400"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.764225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" event={"ID":"16046530-d8fe-40bb-9a22-2a021648faa9","Type":"ContainerStarted","Data":"c2e0a9fdb0eb0d4cb73e2b4705ce2d5a53748626238ccb46f10aeef553ade428"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.766903 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r5jzt" event={"ID":"e42f51b8-542f-4784-88cd-89832dfc1999","Type":"ContainerStarted","Data":"5c13181b8525740e5ae53f2471bb95b25e75d123564a363a7d9c9d0e5a7b65a1"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.769927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" 
event={"ID":"607782d2-50af-4b1e-a3fe-603ad6267bc9","Type":"ContainerStarted","Data":"f04c1b930d71cb8f0964c941a1bb617556f8b4cb1485b706bd42e78b497ca224"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.780647 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" event={"ID":"8ef71aa6-910d-4a67-bef9-2e37d689408b","Type":"ContainerStarted","Data":"eee7cd1325d76874325a1dfafcd87a4f6281dbb8b79abaab02eb14feae2300d8"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.781689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7gvsx" event={"ID":"dd70937c-9e84-468b-b81f-b9f400436aec","Type":"ContainerStarted","Data":"ddf9dc7506d8440af88006361f8b25d192e53101d18c7ca62fa4466be6202c9c"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.783091 4744 generic.go:334] "Generic (PLEG): container finished" podID="8df09393-7557-4bf8-8cbf-e2aa59df04b6" containerID="f0b99a9d05a7a996a69b953dfadfc7e1177abe4ed87e6a63766871e98c19de75" exitCode=0 Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.783148 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" event={"ID":"8df09393-7557-4bf8-8cbf-e2aa59df04b6","Type":"ContainerDied","Data":"f0b99a9d05a7a996a69b953dfadfc7e1177abe4ed87e6a63766871e98c19de75"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.783167 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" event={"ID":"8df09393-7557-4bf8-8cbf-e2aa59df04b6","Type":"ContainerStarted","Data":"1cc8ea82dc3fcc9ed9076ba5a48fd27fb0bb336f74631adede3e2280d42b3986"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.790604 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" 
event={"ID":"eeb7ab47-53ea-434c-8367-ad667abe4168","Type":"ContainerStarted","Data":"7158220d0a1b7c41a22b3d02b504b2b36ae03d242911e640394dcccb822844a0"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.800356 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" event={"ID":"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d","Type":"ContainerStarted","Data":"30d649dfda42b50aaa483142c23b8fad36306d856a3cc7ff78070d05e7a8d416"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.801539 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" event={"ID":"e0918c73-ef87-42c8-8395-9499c5a91e2b","Type":"ContainerStarted","Data":"ebbe0cc6b2fbdcb6d685054617d0536dbb2338e24d1974116ea7b4bbbbac40a2"} Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.836429 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.837286 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.337262912 +0000 UTC m=+165.510482886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.941209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:17 crc kubenswrapper[4744]: E0930 02:57:17.943774 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.443759281 +0000 UTC m=+165.616979255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:17 crc kubenswrapper[4744]: I0930 02:57:17.990430 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.040877 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" podStartSLOduration=139.040849181 podStartE2EDuration="2m19.040849181s" podCreationTimestamp="2025-09-30 02:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.039690885 +0000 UTC m=+165.212910859" watchObservedRunningTime="2025-09-30 02:57:18.040849181 +0000 UTC m=+165.214069155" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.043054 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.043347 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:18.543335087 +0000 UTC m=+165.716555061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.146531 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.146944 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.646931297 +0000 UTC m=+165.820151271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.159823 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-nhkgz" podStartSLOduration=5.159801852 podStartE2EDuration="5.159801852s" podCreationTimestamp="2025-09-30 02:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.157944195 +0000 UTC m=+165.331164169" watchObservedRunningTime="2025-09-30 02:57:18.159801852 +0000 UTC m=+165.333021826" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.247609 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.248094 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.748075272 +0000 UTC m=+165.921295246 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.288895 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r5jzt" podStartSLOduration=140.288873684 podStartE2EDuration="2m20.288873684s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.283937172 +0000 UTC m=+165.457157146" watchObservedRunningTime="2025-09-30 02:57:18.288873684 +0000 UTC m=+165.462093678" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.324035 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fpq6t" podStartSLOduration=140.324011802 podStartE2EDuration="2m20.324011802s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.32329107 +0000 UTC m=+165.496511044" watchObservedRunningTime="2025-09-30 02:57:18.324011802 +0000 UTC m=+165.497231776" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.349771 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: 
\"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.350285 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.850264438 +0000 UTC m=+166.023484412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.356989 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:18 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:18 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:18 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.357046 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.428539 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7d4sv" 
podStartSLOduration=140.42851199 podStartE2EDuration="2m20.42851199s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.381557659 +0000 UTC m=+165.554777653" watchObservedRunningTime="2025-09-30 02:57:18.42851199 +0000 UTC m=+165.601731964" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.455557 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.455948 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:18.955925772 +0000 UTC m=+166.129145746 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.557872 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.558502 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.05848684 +0000 UTC m=+166.231706814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.660523 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.660833 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.16080059 +0000 UTC m=+166.334020564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.661293 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.661726 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.161711628 +0000 UTC m=+166.334931602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.747927 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd"] Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.750466 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lq7lq"] Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.761656 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.761993 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.261972546 +0000 UTC m=+166.435192520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.768577 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pzp4p"] Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.806574 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" event={"ID":"16046530-d8fe-40bb-9a22-2a021648faa9","Type":"ContainerStarted","Data":"695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.807089 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.815112 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" event={"ID":"8df09393-7557-4bf8-8cbf-e2aa59df04b6","Type":"ContainerStarted","Data":"4621965b2e06c8c1437183abdeb3c3fdfed59866b1bdc4a55770cf936e9976cd"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.822296 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" event={"ID":"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6","Type":"ContainerStarted","Data":"ba50fa67923e6ed0ebd789bd19dfe930556ea30c19f18de41921379d1e315443"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.823873 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" event={"ID":"607782d2-50af-4b1e-a3fe-603ad6267bc9","Type":"ContainerStarted","Data":"b04acf596e13a412b67e9caf07815a4c3b62e337a29696f5a10f70871a902b64"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.828429 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7gvsx" event={"ID":"dd70937c-9e84-468b-b81f-b9f400436aec","Type":"ContainerStarted","Data":"ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.833105 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" podStartSLOduration=140.83245376 podStartE2EDuration="2m20.83245376s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.830729547 +0000 UTC m=+166.003949521" watchObservedRunningTime="2025-09-30 02:57:18.83245376 +0000 UTC m=+166.005673734" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.833797 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" event={"ID":"f387f631-c1e7-4dbb-ade0-cdeb4f4d724d","Type":"ContainerStarted","Data":"3422d6540f67ae5714783c1b8d3365196b7077eddd8b2cac177e3220cf519d5d"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.845358 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" event={"ID":"e0918c73-ef87-42c8-8395-9499c5a91e2b","Type":"ContainerStarted","Data":"458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d"} Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.849397 4744 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5rwhq container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.849472 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" podUID="16046530-d8fe-40bb-9a22-2a021648faa9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.864147 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.865638 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zrd7j" podStartSLOduration=140.865606118 podStartE2EDuration="2m20.865606118s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.855441746 +0000 UTC m=+166.028661720" watchObservedRunningTime="2025-09-30 02:57:18.865606118 +0000 UTC m=+166.038826092" Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.868352 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.368333171 +0000 UTC m=+166.541553145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.885596 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7gvsx" podStartSLOduration=140.8855647 podStartE2EDuration="2m20.8855647s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.883984331 +0000 UTC m=+166.057204305" watchObservedRunningTime="2025-09-30 02:57:18.8855647 +0000 UTC m=+166.058784674" Sep 30 02:57:18 crc kubenswrapper[4744]: W0930 02:57:18.912005 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a46c375_83ec_4de9_8047_3abde5224588.slice/crio-66a9a9f7133639d12e38a342531826d5b66295c287e9d40922467cae7d5b2d76 WatchSource:0}: Error finding container 66a9a9f7133639d12e38a342531826d5b66295c287e9d40922467cae7d5b2d76: Status 404 returned error can't find the container with id 66a9a9f7133639d12e38a342531826d5b66295c287e9d40922467cae7d5b2d76 Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.946045 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-k7tkh" podStartSLOduration=140.946012696 podStartE2EDuration="2m20.946012696s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-09-30 02:57:18.913389664 +0000 UTC m=+166.086609638" watchObservedRunningTime="2025-09-30 02:57:18.946012696 +0000 UTC m=+166.119232670" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.950527 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" podStartSLOduration=139.950519974 podStartE2EDuration="2m19.950519974s" podCreationTimestamp="2025-09-30 02:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:18.938770653 +0000 UTC m=+166.111990627" watchObservedRunningTime="2025-09-30 02:57:18.950519974 +0000 UTC m=+166.123739948" Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.967219 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:18 crc kubenswrapper[4744]: E0930 02:57:18.968124 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.468087814 +0000 UTC m=+166.641307788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.995610 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:18 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:18 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:18 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:18 crc kubenswrapper[4744]: I0930 02:57:18.995681 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.030696 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wbwqr"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.030773 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.069288 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.070767 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.570744645 +0000 UTC m=+166.743964619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: W0930 02:57:19.132190 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c6b1765_0a44_41b0_9f4c_d0e1cb8f434e.slice/crio-ddc362f1f371885d6448c615c5e83388c21eac8d56552fed036996f3fe6a950a WatchSource:0}: Error finding container ddc362f1f371885d6448c615c5e83388c21eac8d56552fed036996f3fe6a950a: Status 404 returned error can't find the container with id ddc362f1f371885d6448c615c5e83388c21eac8d56552fed036996f3fe6a950a Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.170612 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.170857 4744 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.670824317 +0000 UTC m=+166.844044291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.171012 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.171533 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.671516437 +0000 UTC m=+166.844736411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.224958 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.244921 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.265744 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.268806 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.277562 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.277956 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:19.777937734 +0000 UTC m=+166.951157698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.279245 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jd84s"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.291829 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.291908 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.304854 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.309492 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.318886 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.321091 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.357565 4744 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5rlq"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.378496 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.382008 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.881989309 +0000 UTC m=+167.055209283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.384596 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-txgv6"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.442333 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gpn94"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.456895 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.471792 4744 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.481248 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.482945 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:19.982921077 +0000 UTC m=+167.156141051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.503181 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zpfzk"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.528281 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.531063 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc"] Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.544480 4744 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct"] Sep 30 02:57:19 crc kubenswrapper[4744]: W0930 02:57:19.545619 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2251154_99b6_4b82_ad16_19e36f3eaf8e.slice/crio-e75a33b0724093dbff2c3476a4416c63af4a4df338bee89e6defda0d6e3e9168 WatchSource:0}: Error finding container e75a33b0724093dbff2c3476a4416c63af4a4df338bee89e6defda0d6e3e9168: Status 404 returned error can't find the container with id e75a33b0724093dbff2c3476a4416c63af4a4df338bee89e6defda0d6e3e9168 Sep 30 02:57:19 crc kubenswrapper[4744]: W0930 02:57:19.547710 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1add32c6_5ed4_415a_a8f3_0de2fb3f71d9.slice/crio-049b442f7facf794193485d43e7e1b767fc4b88d73e72e9e2136812bcf6362a3 WatchSource:0}: Error finding container 049b442f7facf794193485d43e7e1b767fc4b88d73e72e9e2136812bcf6362a3: Status 404 returned error can't find the container with id 049b442f7facf794193485d43e7e1b767fc4b88d73e72e9e2136812bcf6362a3 Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.555292 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr"] Sep 30 02:57:19 crc kubenswrapper[4744]: W0930 02:57:19.579153 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7559e3_bf04_42a4_bb27_33dd6c635ffd.slice/crio-e728685b1ae35c4507f92223e91d6ac5dcda2e3f96d6ce499f5426470c711e5b WatchSource:0}: Error finding container e728685b1ae35c4507f92223e91d6ac5dcda2e3f96d6ce499f5426470c711e5b: Status 404 returned error can't find the container with id e728685b1ae35c4507f92223e91d6ac5dcda2e3f96d6ce499f5426470c711e5b Sep 30 02:57:19 crc kubenswrapper[4744]: W0930 
02:57:19.586749 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f8801e_bb23_4a2f_bd03_9711d966d3c7.slice/crio-148a99f09a37d452cb180fff9c8c0f9c0276758854146b345920a62d120de7cb WatchSource:0}: Error finding container 148a99f09a37d452cb180fff9c8c0f9c0276758854146b345920a62d120de7cb: Status 404 returned error can't find the container with id 148a99f09a37d452cb180fff9c8c0f9c0276758854146b345920a62d120de7cb Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.590449 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.592076 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.092059386 +0000 UTC m=+167.265279360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.694699 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.695923 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.195888024 +0000 UTC m=+167.369107988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.696205 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.697240 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.197210694 +0000 UTC m=+167.370430668 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.798666 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.799014 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.298992899 +0000 UTC m=+167.472212873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.869923 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" event={"ID":"d9401991-d332-4a04-85be-b3d5a7b00c27","Type":"ContainerStarted","Data":"246a4e18064f9968bc42bb43146e782f46b2307eca4919a88d1f20bf70e5e0d6"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.869976 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" event={"ID":"d9401991-d332-4a04-85be-b3d5a7b00c27","Type":"ContainerStarted","Data":"acf221c1f8dedda211a35854df9abddf05bea1c37e79bae7d925077ee2d6e778"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.877621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" event={"ID":"6a46c375-83ec-4de9-8047-3abde5224588","Type":"ContainerStarted","Data":"cbd13cf3564a4f9abf810f92bbe27dfaad84281bde299c478bebc573b321eca1"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.877646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" event={"ID":"6a46c375-83ec-4de9-8047-3abde5224588","Type":"ContainerStarted","Data":"66a9a9f7133639d12e38a342531826d5b66295c287e9d40922467cae7d5b2d76"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.894512 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-jd84s" event={"ID":"cfd94804-02a1-436d-b2a4-2fd4eb7502ab","Type":"ContainerStarted","Data":"be30a4047fa50acbb2c71117bf3deef7b7e3b501e792c95b86fb60a3156bb5cf"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.897657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" event={"ID":"fd4c491e-16e3-4e31-a4a9-314d53ceada8","Type":"ContainerStarted","Data":"823641158deffaee96c404bc5b914f8160f4fdb36953bbbd4b7f0c74d293813c"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.899960 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" event={"ID":"c2251154-99b6-4b82-ad16-19e36f3eaf8e","Type":"ContainerStarted","Data":"e75a33b0724093dbff2c3476a4416c63af4a4df338bee89e6defda0d6e3e9168"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.900125 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:19 crc kubenswrapper[4744]: E0930 02:57:19.901348 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.40132801 +0000 UTC m=+167.574547984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.910952 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lq7lq" podStartSLOduration=140.910933535 podStartE2EDuration="2m20.910933535s" podCreationTimestamp="2025-09-30 02:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:19.891395225 +0000 UTC m=+167.064615199" watchObservedRunningTime="2025-09-30 02:57:19.910933535 +0000 UTC m=+167.084153509" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.918055 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" event={"ID":"8df09393-7557-4bf8-8cbf-e2aa59df04b6","Type":"ContainerStarted","Data":"ed4fd2f6bdee088581727a4bb5a64392f6a1d7c85e49dd8846ab576ed31709bf"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.927819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" event={"ID":"b2676764-efb6-4e02-9012-74b8675e7bff","Type":"ContainerStarted","Data":"eae9c434ca83d18c314f4f13a2361ec2f930d0e33821052c947cf4567f861c9f"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.946654 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" podStartSLOduration=141.946565798 podStartE2EDuration="2m21.946565798s" podCreationTimestamp="2025-09-30 
02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:19.946266419 +0000 UTC m=+167.119486403" watchObservedRunningTime="2025-09-30 02:57:19.946565798 +0000 UTC m=+167.119785772" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.946832 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4w5fd" podStartSLOduration=141.946825247 podStartE2EDuration="2m21.946825247s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:19.915094582 +0000 UTC m=+167.088314556" watchObservedRunningTime="2025-09-30 02:57:19.946825247 +0000 UTC m=+167.120045221" Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.947089 4744 generic.go:334] "Generic (PLEG): container finished" podID="ca4c6d8e-e39a-4302-af0b-029aa35ca1e6" containerID="86c9252a61d32111b1e7e3105586864f5d73282c8fcb7348856f9a5b45d8d179" exitCode=0 Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.947212 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" event={"ID":"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6","Type":"ContainerDied","Data":"86c9252a61d32111b1e7e3105586864f5d73282c8fcb7348856f9a5b45d8d179"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.948813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zpfzk" event={"ID":"ef7559e3-bf04-42a4-bb27-33dd6c635ffd","Type":"ContainerStarted","Data":"e728685b1ae35c4507f92223e91d6ac5dcda2e3f96d6ce499f5426470c711e5b"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.953807 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" event={"ID":"7bfa0572-7577-49a6-9845-782e3ca7df2f","Type":"ContainerStarted","Data":"51d8f74878714f33928bbedd0f08b3917b1e6a2974f2c470f08bac41004f950e"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.986730 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" event={"ID":"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e","Type":"ContainerStarted","Data":"ddc362f1f371885d6448c615c5e83388c21eac8d56552fed036996f3fe6a950a"} Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.996627 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:19 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:19 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:19 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:19 crc kubenswrapper[4744]: I0930 02:57:19.996703 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:19.999010 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" event={"ID":"0ddbf2ad-5319-4838-9ff2-b154ae354bf1","Type":"ContainerStarted","Data":"36fb73ada69ec5ebe8bc269a94509fe68d5c1ef7a25f556124341ffffbde9104"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.014922 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.016298 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.516276839 +0000 UTC m=+167.689496813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.023085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" event={"ID":"1ba3df4a-66a1-47bc-924b-542c7ca89389","Type":"ContainerStarted","Data":"a5a5b4d247393c7748eed10b1efd7898d8d065d774e6f23fe2b86b6ee404f91f"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.023716 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" event={"ID":"1ba3df4a-66a1-47bc-924b-542c7ca89389","Type":"ContainerStarted","Data":"f6d3170008288b5ef568b9a5806cb6607fc6974ea85b81bd7a53f585679c4c97"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.028163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" 
event={"ID":"97f8801e-bb23-4a2f-bd03-9711d966d3c7","Type":"ContainerStarted","Data":"148a99f09a37d452cb180fff9c8c0f9c0276758854146b345920a62d120de7cb"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.038282 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" event={"ID":"486b7838-3a4d-45be-b4b7-c2ec085d7a07","Type":"ContainerStarted","Data":"636eccaab9caea860f3256119874347861b7fafaa8033d0ab44640fe7826e029"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.041891 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" event={"ID":"63ea8335-da26-4a4d-b35e-87870d3d61b1","Type":"ContainerStarted","Data":"1e8643fb2f1f48ae967aa4abd4ad3e50dc75cfc0e0f9f30590f8b55deb370c25"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.041940 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" event={"ID":"63ea8335-da26-4a4d-b35e-87870d3d61b1","Type":"ContainerStarted","Data":"561e161b6bcf4967fd12967635942fb4a5fdaae9c478f3cee4d70156d0833ec1"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.070268 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" event={"ID":"d0c87777-c4d8-4783-93a0-67e2b680f770","Type":"ContainerStarted","Data":"ca58e7157a0cb2c1ae0d50d63e4276522517784e572003feaa3f3d408ebbb71c"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.073880 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" event={"ID":"aa8552c2-312f-40d7-abdb-160f831e5c04","Type":"ContainerStarted","Data":"37269cdbe736879e9fe48c272101c01ace7c3dfc835c14823f7032c53c185b34"} Sep 30 02:57:20 crc 
kubenswrapper[4744]: I0930 02:57:20.112744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" event={"ID":"12a32ac7-5e93-4dbf-af6c-3f60ac33e944","Type":"ContainerStarted","Data":"c85e00c86bf9be7609f763cddf77d9f72995887b496a95d3835fb022505cd9b5"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.117588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.118954 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" event={"ID":"63ddf643-780d-438a-bf7b-bf73096c9902","Type":"ContainerStarted","Data":"126846f73d8ff0bafd31153511f04f14edc7c761a910df0719f3db832da3fbef"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.118989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" event={"ID":"63ddf643-780d-438a-bf7b-bf73096c9902","Type":"ContainerStarted","Data":"79874f920593c517e29984422b1bfe4cfb0989f33dcedf96eb19c805ba998c42"} Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.119334 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.619319901 +0000 UTC m=+167.792539875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.148172 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" event={"ID":"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f","Type":"ContainerStarted","Data":"45edbd88de16663dc9ef975ba9cf5d473fc9b4a576b147cb8b88e909af6e467d"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.152329 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" event={"ID":"68454d10-9f26-41d7-9b42-1ee60a78a809","Type":"ContainerStarted","Data":"6d2edb369e72538279f21e50567c8d0ef522aa3ebc29bb11f6f7fa4a753f14c2"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.159526 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" event={"ID":"c68467b0-9ad1-4164-bec6-3e0f0f2abe87","Type":"ContainerStarted","Data":"c30e3e3a86c893d62c9137af92a538b67d590eb419f4a127c6c03fc045f35c26"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.173474 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-92m7x" podStartSLOduration=142.173446113 podStartE2EDuration="2m22.173446113s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
02:57:20.062693374 +0000 UTC m=+167.235913348" watchObservedRunningTime="2025-09-30 02:57:20.173446113 +0000 UTC m=+167.346666087" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.173787 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" podStartSLOduration=142.173779243 podStartE2EDuration="2m22.173779243s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:20.172662319 +0000 UTC m=+167.345882283" watchObservedRunningTime="2025-09-30 02:57:20.173779243 +0000 UTC m=+167.346999207" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.174031 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" event={"ID":"32a4e9f9-124b-47f6-821c-44714e635968","Type":"ContainerStarted","Data":"86649084d7221cff09a9d68cf45c7fc091234d32581b163423a396c406c4b683"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.180197 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gpn94" event={"ID":"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9","Type":"ContainerStarted","Data":"049b442f7facf794193485d43e7e1b767fc4b88d73e72e9e2136812bcf6362a3"} Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.180678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.194920 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.195024 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lx8qb" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.202866 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.218822 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.221695 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.721674724 +0000 UTC m=+167.894894698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.324564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.324957 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.824942664 +0000 UTC m=+167.998162628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.426948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.428154 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:20.928130711 +0000 UTC m=+168.101350685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.529424 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.529920 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.029905905 +0000 UTC m=+168.203125879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.631420 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.631644 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.131609677 +0000 UTC m=+168.304829651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.631937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.632344 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.132328609 +0000 UTC m=+168.305548583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.739231 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.739914 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.23987696 +0000 UTC m=+168.413096934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.740823 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.741168 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.24115418 +0000 UTC m=+168.414374154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.844527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.844779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.844945 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.344927265 +0000 UTC m=+168.518147239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.854880 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d91f1289-b199-4e91-9bbd-78ec9a433706-metrics-certs\") pod \"network-metrics-daemon-zd85c\" (UID: \"d91f1289-b199-4e91-9bbd-78ec9a433706\") " pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.924753 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.925102 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.946557 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:20 crc kubenswrapper[4744]: E0930 02:57:20.947129 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:21.447109191 +0000 UTC m=+168.620329165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.992280 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:20 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:20 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:20 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:20 crc kubenswrapper[4744]: I0930 02:57:20.992391 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.048417 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.048567 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.548546405 +0000 UTC m=+168.721766379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.048806 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.049164 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.549156414 +0000 UTC m=+168.722376388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.127436 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zd85c" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.150430 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.150917 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.650896377 +0000 UTC m=+168.824116341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.225154 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" event={"ID":"fd4c491e-16e3-4e31-a4a9-314d53ceada8","Type":"ContainerStarted","Data":"8999f8e3488365ece7a71dbcc74dff5974c59490364a69cbbbd0e9f7d1ae0b50"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.226929 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.228571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" event={"ID":"b2676764-efb6-4e02-9012-74b8675e7bff","Type":"ContainerStarted","Data":"3f96b830334a792bece915589388a820996eb1f7aac1428b07a7c6bdb6871a19"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.239597 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h5rlq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.239684 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.253657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.254194 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.754168257 +0000 UTC m=+168.927388461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.255911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" event={"ID":"d0c87777-c4d8-4783-93a0-67e2b680f770","Type":"ContainerStarted","Data":"5aaa2b2fa25bccfb47ae55af907ba0a945a4867254922dfb8c97c8f74837a7ef"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.255961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" 
event={"ID":"d0c87777-c4d8-4783-93a0-67e2b680f770","Type":"ContainerStarted","Data":"689c649d4d6b0b88f1efe00a6925043f9376658c4e269977b2d72f9def84d35e"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.258116 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" podStartSLOduration=143.258098318 podStartE2EDuration="2m23.258098318s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.256893531 +0000 UTC m=+168.430113505" watchObservedRunningTime="2025-09-30 02:57:21.258098318 +0000 UTC m=+168.431318292" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.267541 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" event={"ID":"aa8552c2-312f-40d7-abdb-160f831e5c04","Type":"ContainerStarted","Data":"c42b5373f1ecc999b8866ce8817395f99d55bfe3ebfe6ed68f53a794e5423067"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.277164 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" event={"ID":"1ba3df4a-66a1-47bc-924b-542c7ca89389","Type":"ContainerStarted","Data":"681268ee537bdf3c4a570d11fbfc4fa247d502479d7e01e3264a28d6625de958"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.282492 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-g9w4l" podStartSLOduration=143.282474006 podStartE2EDuration="2m23.282474006s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.281014881 +0000 UTC m=+168.454234855" 
watchObservedRunningTime="2025-09-30 02:57:21.282474006 +0000 UTC m=+168.455693980" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.315799 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gpn94" event={"ID":"1add32c6-5ed4-415a-a8f3-0de2fb3f71d9","Type":"ContainerStarted","Data":"8164b84ff1deb8faaec607f12b58dc541b7ea94d3e52dfddcf0d4717203bee96"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.320463 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" podStartSLOduration=143.320444611 podStartE2EDuration="2m23.320444611s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.316906682 +0000 UTC m=+168.490126656" watchObservedRunningTime="2025-09-30 02:57:21.320444611 +0000 UTC m=+168.493664585" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.321668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" event={"ID":"12a32ac7-5e93-4dbf-af6c-3f60ac33e944","Type":"ContainerStarted","Data":"720fac6e3f4fa60a92f8b3144b6af01cd6089f5dbbc729cae657c7d88a6a751b"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.325161 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" event={"ID":"63ddf643-780d-438a-bf7b-bf73096c9902","Type":"ContainerStarted","Data":"a39fda84af969d7ab67be843614df14b40a42892ee42e15962da1382917854d1"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.337015 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pzp4p" podStartSLOduration=143.336994429 podStartE2EDuration="2m23.336994429s" podCreationTimestamp="2025-09-30 02:54:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.336081911 +0000 UTC m=+168.509301885" watchObservedRunningTime="2025-09-30 02:57:21.336994429 +0000 UTC m=+168.510214403" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.354882 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.356703 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.856673833 +0000 UTC m=+169.029893807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.368862 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" event={"ID":"ca4c6d8e-e39a-4302-af0b-029aa35ca1e6","Type":"ContainerStarted","Data":"802ff9e923c03763b2c2f2e5412c2bee2fcf68474a727f5a6b8940ee16340bce"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.368937 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.397495 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gpn94" podStartSLOduration=8.397462015 podStartE2EDuration="8.397462015s" podCreationTimestamp="2025-09-30 02:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.37184462 +0000 UTC m=+168.545064594" watchObservedRunningTime="2025-09-30 02:57:21.397462015 +0000 UTC m=+168.570681979" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.398759 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c8gct" podStartSLOduration=143.398749965 podStartE2EDuration="2m23.398749965s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.396514356 +0000 UTC m=+168.569734330" watchObservedRunningTime="2025-09-30 02:57:21.398749965 +0000 UTC m=+168.571969939" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.401751 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zpfzk" event={"ID":"ef7559e3-bf04-42a4-bb27-33dd6c635ffd","Type":"ContainerStarted","Data":"7a86dd23f63791f2a1e3e866e7ba6768981529b9bb6ecc8c05870362c693b043"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.402103 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.417517 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" event={"ID":"97f8801e-bb23-4a2f-bd03-9711d966d3c7","Type":"ContainerStarted","Data":"ed5b7ee1ca8c9317cda59e58fb38f91eb4547e4d6b947c5bc2f1e855247b67ff"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.418419 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.426605 4744 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4jrqr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.426680 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" podUID="97f8801e-bb23-4a2f-bd03-9711d966d3c7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Sep 
30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.428208 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wbwqr" podStartSLOduration=143.428187688 podStartE2EDuration="2m23.428187688s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.427007803 +0000 UTC m=+168.600227777" watchObservedRunningTime="2025-09-30 02:57:21.428187688 +0000 UTC m=+168.601407662" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.429613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" event={"ID":"c68467b0-9ad1-4164-bec6-3e0f0f2abe87","Type":"ContainerStarted","Data":"3fc5aa053d702d7dc3cab7791bb1d46e92a0cc669c7ec77faed168bf82736d7f"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.430324 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.435913 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" event={"ID":"7bfa0572-7577-49a6-9845-782e3ca7df2f","Type":"ContainerStarted","Data":"60f0779fdbe031d7a1b9fa7220ab5799860690c29079ec8717d6d58b6b5b5311"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.446746 4744 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zd2rc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.446831 4744 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" podUID="c68467b0-9ad1-4164-bec6-3e0f0f2abe87" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.449848 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" podStartSLOduration=143.449829883 podStartE2EDuration="2m23.449829883s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.447069148 +0000 UTC m=+168.620289122" watchObservedRunningTime="2025-09-30 02:57:21.449829883 +0000 UTC m=+168.623049857" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.457420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.459256 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:21.959240372 +0000 UTC m=+169.132460346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.462713 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" event={"ID":"32a4e9f9-124b-47f6-821c-44714e635968","Type":"ContainerStarted","Data":"fc6202a3993502dd12714b1e3b890cccac5a2df7f3a2267bc91edf2daddc9bdf"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.462756 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" event={"ID":"32a4e9f9-124b-47f6-821c-44714e635968","Type":"ContainerStarted","Data":"7f3117fc3cbb624d637fad3e101f39d2d9920c2f79fe15a6d59b6c2e122ef72f"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.463288 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.471443 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" event={"ID":"486b7838-3a4d-45be-b4b7-c2ec085d7a07","Type":"ContainerStarted","Data":"616bd9962afde3c20b9a9edaa26787287bab45622aa9d811d2a226bcc2257c8b"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.471476 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" 
event={"ID":"486b7838-3a4d-45be-b4b7-c2ec085d7a07","Type":"ContainerStarted","Data":"fc104dbe1803b595b4f3ff771141cbfee6e4f6d6f1bba6e2b8683cbf06e1f009"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.474405 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" event={"ID":"6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e","Type":"ContainerStarted","Data":"ef8bc5817240304e2f8309980928de1ec374ba0d0b7daec15cdf17de613c1777"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.488785 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" event={"ID":"0ddbf2ad-5319-4838-9ff2-b154ae354bf1","Type":"ContainerStarted","Data":"15443ea5c572a9028605da70d30b63a2b67d83cdfe91657378c95dfe6e3a80bc"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.488845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" event={"ID":"0ddbf2ad-5319-4838-9ff2-b154ae354bf1","Type":"ContainerStarted","Data":"909f43ef6f7a0daf3eb28f32c081603c7cbac8d66621865709bfa0e703fa7da4"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.499872 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" event={"ID":"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f","Type":"ContainerStarted","Data":"85b0bbe1d1545c944ab11c54901552c85f7333598be2f27ca0f1d84a815f71c9"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.499945 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" event={"ID":"a3a14b54-caea-4d75-baa6-cf8ddd2cc70f","Type":"ContainerStarted","Data":"1fffcbe043f1b22c15ee6676ca9cbcbde159af5a79210b5357253caa0d7d5e0d"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.520357 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lt5f2" event={"ID":"68454d10-9f26-41d7-9b42-1ee60a78a809","Type":"ContainerStarted","Data":"133e55e6d9bd9a51b0a41e0fb0054c4921cb40e6a99b6732e260a8e72a4b1c2a"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.520416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" event={"ID":"c2251154-99b6-4b82-ad16-19e36f3eaf8e","Type":"ContainerStarted","Data":"d62edaa7c10959838a63e69bf925c1d672ffd9f0d16f1fe62c2678087073f946"} Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.522914 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.523428 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-txgv6" podStartSLOduration=142.523405581 podStartE2EDuration="2m22.523405581s" podCreationTimestamp="2025-09-30 02:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.487906232 +0000 UTC m=+168.661126206" watchObservedRunningTime="2025-09-30 02:57:21.523405581 +0000 UTC m=+168.696625555" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.544809 4744 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qrrpz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.544881 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" podUID="c2251154-99b6-4b82-ad16-19e36f3eaf8e" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.553424 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pczrt" podStartSLOduration=143.553400153 podStartE2EDuration="2m23.553400153s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.552849245 +0000 UTC m=+168.726069219" watchObservedRunningTime="2025-09-30 02:57:21.553400153 +0000 UTC m=+168.726620127" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.554785 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" podStartSLOduration=143.554780144 podStartE2EDuration="2m23.554780144s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.525535577 +0000 UTC m=+168.698755551" watchObservedRunningTime="2025-09-30 02:57:21.554780144 +0000 UTC m=+168.728000118" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.558266 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.559541 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.059514209 +0000 UTC m=+169.232734183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.606860 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" podStartSLOduration=143.606826462 podStartE2EDuration="2m23.606826462s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.592150552 +0000 UTC m=+168.765370516" watchObservedRunningTime="2025-09-30 02:57:21.606826462 +0000 UTC m=+168.780046436" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.651314 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fcck5" podStartSLOduration=143.651289767 podStartE2EDuration="2m23.651289767s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.638314539 +0000 UTC m=+168.811534513" watchObservedRunningTime="2025-09-30 02:57:21.651289767 +0000 UTC m=+168.824509741" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.660874 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.661261 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.161246523 +0000 UTC m=+169.334466497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.670424 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" podStartSLOduration=143.670399353 podStartE2EDuration="2m23.670399353s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.659997185 +0000 UTC m=+168.833217159" watchObservedRunningTime="2025-09-30 02:57:21.670399353 +0000 UTC m=+168.843619327" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.686528 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zpfzk" podStartSLOduration=8.686505348 podStartE2EDuration="8.686505348s" 
podCreationTimestamp="2025-09-30 02:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.683750283 +0000 UTC m=+168.856970257" watchObservedRunningTime="2025-09-30 02:57:21.686505348 +0000 UTC m=+168.859725322" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.714105 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ssdfc" podStartSLOduration=143.714085655 podStartE2EDuration="2m23.714085655s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.713343952 +0000 UTC m=+168.886563946" watchObservedRunningTime="2025-09-30 02:57:21.714085655 +0000 UTC m=+168.887305629" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.734214 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tb8tv" podStartSLOduration=143.734186572 podStartE2EDuration="2m23.734186572s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:21.730929671 +0000 UTC m=+168.904149645" watchObservedRunningTime="2025-09-30 02:57:21.734186572 +0000 UTC m=+168.907406546" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.760632 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" podStartSLOduration=143.760610763 podStartE2EDuration="2m23.760610763s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 02:57:21.759164699 +0000 UTC m=+168.932384673" watchObservedRunningTime="2025-09-30 02:57:21.760610763 +0000 UTC m=+168.933830737" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.763771 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.764248 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.264229314 +0000 UTC m=+169.437449288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.789506 4744 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xxnzf container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]log ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]etcd ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/generic-apiserver-start-informers ok Sep 
30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/max-in-flight-filter ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 02:57:21 crc kubenswrapper[4744]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 02:57:21 crc kubenswrapper[4744]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/openshift.io-startinformers ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 02:57:21 crc kubenswrapper[4744]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 02:57:21 crc kubenswrapper[4744]: livez check failed Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.789586 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" podUID="8df09393-7557-4bf8-8cbf-e2aa59df04b6" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.790825 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zd85c"] Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.800254 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f2tn5" podStartSLOduration=143.800236739 podStartE2EDuration="2m23.800236739s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 02:57:21.795357699 +0000 UTC m=+168.968577663" watchObservedRunningTime="2025-09-30 02:57:21.800236739 +0000 UTC m=+168.973456713" Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.871157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.871591 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.371576099 +0000 UTC m=+169.544796073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.974219 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.974440 4744 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.474407695 +0000 UTC m=+169.647627669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.974546 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:21 crc kubenswrapper[4744]: E0930 02:57:21.974926 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.474918071 +0000 UTC m=+169.648138045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.990005 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:21 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:21 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:21 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:21 crc kubenswrapper[4744]: I0930 02:57:21.990083 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.075835 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.076106 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:22.576065866 +0000 UTC m=+169.749285840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.076209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.076598 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.576581662 +0000 UTC m=+169.749801636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.177084 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.177322 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.677277143 +0000 UTC m=+169.850497127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.177950 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.178347 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.678335845 +0000 UTC m=+169.851555999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.279529 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.279780 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.779742008 +0000 UTC m=+169.952961982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.279955 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.280410 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.780395478 +0000 UTC m=+169.953615452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.382038 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.382274 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.882237594 +0000 UTC m=+170.055457568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.382582 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.383027 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.883015499 +0000 UTC m=+170.056235473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.483627 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.483986 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:22.983964677 +0000 UTC m=+170.157184651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.531031 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jd84s" event={"ID":"cfd94804-02a1-436d-b2a4-2fd4eb7502ab","Type":"ContainerStarted","Data":"1bfbfaff0baa750173db830b716f6412c223d7fb123bdba811f15474ac276f61"} Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.540827 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zd85c" event={"ID":"d91f1289-b199-4e91-9bbd-78ec9a433706","Type":"ContainerStarted","Data":"1dc46f1fa77cbfe8a7135494fcf750f9f066532e02dc7ec47f521f461b40846a"} Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.540905 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zd85c" event={"ID":"d91f1289-b199-4e91-9bbd-78ec9a433706","Type":"ContainerStarted","Data":"159c361d7df84e4507cee1fd69530b12959e590ad0025764db93491be75785e2"} Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.540918 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zd85c" event={"ID":"d91f1289-b199-4e91-9bbd-78ec9a433706","Type":"ContainerStarted","Data":"74072723f9375ba61934a323b60286e394dcfca518b79a0460576ca135ce9f25"} Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.547400 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zpfzk" 
event={"ID":"ef7559e3-bf04-42a4-bb27-33dd6c635ffd","Type":"ContainerStarted","Data":"3f16ff102f6b1178f9c4ed1ec0a03b7ced80b3d22c47449d1d3b5b839d810c8a"} Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.548281 4744 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h5rlq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.548333 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.554481 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qrrpz" Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.560906 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zd2rc" Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.572806 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zd85c" podStartSLOduration=144.572791254 podStartE2EDuration="2m24.572791254s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:22.57200322 +0000 UTC m=+169.745223194" watchObservedRunningTime="2025-09-30 02:57:22.572791254 +0000 UTC m=+169.746011228" Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.585204 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.589663 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.089646111 +0000 UTC m=+170.262866085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.687462 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.688469 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.188443364 +0000 UTC m=+170.361663338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.789728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.790398 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.290357762 +0000 UTC m=+170.463577916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.890729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.890979 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.390935789 +0000 UTC m=+170.564155763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.891265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.891679 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.391662841 +0000 UTC m=+170.564882815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.990459 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:22 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:22 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:22 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.990563 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.992387 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.992589 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 02:57:23.492554379 +0000 UTC m=+170.665774353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:22 crc kubenswrapper[4744]: I0930 02:57:22.992646 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:22 crc kubenswrapper[4744]: E0930 02:57:22.993060 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.493051334 +0000 UTC m=+170.666271308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.094326 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.094586 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.594547269 +0000 UTC m=+170.767767243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.094731 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.095124 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.595115657 +0000 UTC m=+170.768335631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.136946 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jrqr" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.223885 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.224224 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.724194569 +0000 UTC m=+170.897414543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.224747 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.225121 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.725104527 +0000 UTC m=+170.898324501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.326022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.326203 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.826171439 +0000 UTC m=+170.999391413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.326265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.326716 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.826700785 +0000 UTC m=+170.999920759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.398145 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dj824"] Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.399406 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.405877 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.413524 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj824"] Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.428391 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.428897 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:23.928875472 +0000 UTC m=+171.102095446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.489247 4744 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.529952 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvzfq\" (UniqueName: \"kubernetes.io/projected/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-kube-api-access-hvzfq\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.530047 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-catalog-content\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.530104 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-utilities\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.530158 
4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.530604 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:24.030588635 +0000 UTC m=+171.203808609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.556097 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jd84s" event={"ID":"cfd94804-02a1-436d-b2a4-2fd4eb7502ab","Type":"ContainerStarted","Data":"c2aee22aacae20287bbf52290313e38f52843096afc0a39fc18e7a366a5cf1bb"} Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.556145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jd84s" event={"ID":"cfd94804-02a1-436d-b2a4-2fd4eb7502ab","Type":"ContainerStarted","Data":"95abcf9145784cc9cf7b8153129d759bf254efd1256ef5b4fdd3b091278532e6"} Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.556156 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-jd84s" event={"ID":"cfd94804-02a1-436d-b2a4-2fd4eb7502ab","Type":"ContainerStarted","Data":"db890c9bcc80d56340f34bdbc5a481a87720bb05132cb4515da3739635f4cddc"} Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.560875 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.589126 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jd84s" podStartSLOduration=10.58909421 podStartE2EDuration="10.58909421s" podCreationTimestamp="2025-09-30 02:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:23.581774365 +0000 UTC m=+170.754994349" watchObservedRunningTime="2025-09-30 02:57:23.58909421 +0000 UTC m=+170.762314184" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.610592 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bm469"] Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.612129 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.617383 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.627083 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bm469"] Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.631133 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.631303 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:24.131267915 +0000 UTC m=+171.304487889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.631579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvzfq\" (UniqueName: \"kubernetes.io/projected/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-kube-api-access-hvzfq\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.631867 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-catalog-content\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.631930 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-utilities\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.632023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: 
\"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.633183 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-utilities\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.633695 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:24.133673889 +0000 UTC m=+171.306893863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.634866 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-catalog-content\") pod \"certified-operators-dj824\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.700677 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvzfq\" (UniqueName: \"kubernetes.io/projected/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-kube-api-access-hvzfq\") pod \"certified-operators-dj824\" (UID: 
\"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.707443 4744 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T02:57:23.489276096Z","Handler":null,"Name":""} Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.720597 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.739248 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.739480 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 02:57:24.239443995 +0000 UTC m=+171.412663969 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.739874 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.739957 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-utilities\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.739993 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-catalog-content\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.740038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llgd9\" (UniqueName: 
\"kubernetes.io/projected/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-kube-api-access-llgd9\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: E0930 02:57:23.740253 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 02:57:24.24024467 +0000 UTC m=+171.413464644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q47dv" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.749965 4744 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.750014 4744 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.798671 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mrkxm"] Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.799679 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.812102 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mrkxm"] Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.841377 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.841581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-catalog-content\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.841633 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llgd9\" (UniqueName: \"kubernetes.io/projected/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-kube-api-access-llgd9\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.841726 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-utilities\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.842234 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-utilities\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.844151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-catalog-content\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.862178 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.868687 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llgd9\" (UniqueName: \"kubernetes.io/projected/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-kube-api-access-llgd9\") pod \"community-operators-bm469\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.926561 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.942579 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-catalog-content\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.942633 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.942689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-utilities\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.942717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zms2x\" (UniqueName: \"kubernetes.io/projected/e5865885-5a62-4a4a-abed-d5a996d65890-kube-api-access-zms2x\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.958611 4744 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.958657 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.994044 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:23 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:23 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:23 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:23 crc kubenswrapper[4744]: I0930 02:57:23.994110 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.005881 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ks7tg"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.010228 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.024850 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks7tg"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.031217 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q47dv\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.050807 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-catalog-content\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.050898 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-utilities\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.050935 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zms2x\" (UniqueName: \"kubernetes.io/projected/e5865885-5a62-4a4a-abed-d5a996d65890-kube-api-access-zms2x\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.051693 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-utilities\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.051849 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.052992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-catalog-content\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.064861 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj824"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.069085 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zms2x\" (UniqueName: \"kubernetes.io/projected/e5865885-5a62-4a4a-abed-d5a996d65890-kube-api-access-zms2x\") pod \"certified-operators-mrkxm\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.069724 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-bpp4m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.069733 4744 patch_prober.go:28] interesting pod/downloads-7954f5f757-bpp4m container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.069776 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bpp4m" podUID="a50ca402-327c-41ea-832c-15ad7932d8f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.069840 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bpp4m" podUID="a50ca402-327c-41ea-832c-15ad7932d8f5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Sep 30 02:57:24 crc kubenswrapper[4744]: W0930 02:57:24.083453 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2592cf_cbdf_4a1d_9f0d_c4b6b5665e91.slice/crio-2847ac49892ce56f5723bc19b2c11b5e35213ef341380a1a01ccf527825e0884 WatchSource:0}: Error finding container 2847ac49892ce56f5723bc19b2c11b5e35213ef341380a1a01ccf527825e0884: Status 404 returned error can't find the container with id 2847ac49892ce56f5723bc19b2c11b5e35213ef341380a1a01ccf527825e0884 Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.135195 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.153604 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9t5j\" (UniqueName: \"kubernetes.io/projected/f469d578-93cd-4537-bdc8-6c8908926457-kube-api-access-l9t5j\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.153692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-utilities\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.153778 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-catalog-content\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.204279 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bm469"] Sep 30 02:57:24 crc kubenswrapper[4744]: W0930 02:57:24.221767 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a4ab3e_6ffc_487f_aaa7_fc255de0bbe4.slice/crio-ce0da2eb92a2cb9ab055805110e9e3bc63426cc0f7f78d1015ac7534c38da3be WatchSource:0}: Error finding container ce0da2eb92a2cb9ab055805110e9e3bc63426cc0f7f78d1015ac7534c38da3be: Status 404 returned error can't find the container with id 
ce0da2eb92a2cb9ab055805110e9e3bc63426cc0f7f78d1015ac7534c38da3be Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.255068 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-utilities\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.255174 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-catalog-content\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.255213 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9t5j\" (UniqueName: \"kubernetes.io/projected/f469d578-93cd-4537-bdc8-6c8908926457-kube-api-access-l9t5j\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.255966 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-utilities\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.256210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-catalog-content\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " 
pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.277497 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9t5j\" (UniqueName: \"kubernetes.io/projected/f469d578-93cd-4537-bdc8-6c8908926457-kube-api-access-l9t5j\") pod \"community-operators-ks7tg\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.315840 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q47dv"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.347103 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:57:24 crc kubenswrapper[4744]: W0930 02:57:24.359383 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5f1e36_fb65_446e_92df_6d6bb5cca50d.slice/crio-e7bd018b3d08396b83b9b645a2cf67a871c73b867953081a4a7d7d6c85be3ece WatchSource:0}: Error finding container e7bd018b3d08396b83b9b645a2cf67a871c73b867953081a4a7d7d6c85be3ece: Status 404 returned error can't find the container with id e7bd018b3d08396b83b9b645a2cf67a871c73b867953081a4a7d7d6c85be3ece Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.378649 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mrkxm"] Sep 30 02:57:24 crc kubenswrapper[4744]: W0930 02:57:24.390025 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5865885_5a62_4a4a_abed_d5a996d65890.slice/crio-573d59ca5b4df3c4b072d34ae106e41d02c93588c25698be44920f5aeff69608 WatchSource:0}: Error finding container 573d59ca5b4df3c4b072d34ae106e41d02c93588c25698be44920f5aeff69608: Status 404 returned error can't find the 
container with id 573d59ca5b4df3c4b072d34ae106e41d02c93588c25698be44920f5aeff69608 Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.538072 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks7tg"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.584149 4744 generic.go:334] "Generic (PLEG): container finished" podID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerID="6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d" exitCode=0 Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.584943 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm469" event={"ID":"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4","Type":"ContainerDied","Data":"6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.584989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm469" event={"ID":"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4","Type":"ContainerStarted","Data":"ce0da2eb92a2cb9ab055805110e9e3bc63426cc0f7f78d1015ac7534c38da3be"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.590209 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.590975 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" event={"ID":"ac5f1e36-fb65-446e-92df-6d6bb5cca50d","Type":"ContainerStarted","Data":"5e19c127461a20409b4977abee7556895ac4e46683fc6bf51d89458299e66915"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.591028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" event={"ID":"ac5f1e36-fb65-446e-92df-6d6bb5cca50d","Type":"ContainerStarted","Data":"e7bd018b3d08396b83b9b645a2cf67a871c73b867953081a4a7d7d6c85be3ece"} Sep 30 
02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.591070 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.608128 4744 generic.go:334] "Generic (PLEG): container finished" podID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerID="e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea" exitCode=0 Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.608269 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj824" event={"ID":"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91","Type":"ContainerDied","Data":"e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.608305 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj824" event={"ID":"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91","Type":"ContainerStarted","Data":"2847ac49892ce56f5723bc19b2c11b5e35213ef341380a1a01ccf527825e0884"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.622654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerStarted","Data":"4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.622733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerStarted","Data":"573d59ca5b4df3c4b072d34ae106e41d02c93588c25698be44920f5aeff69608"} Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.630191 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" podStartSLOduration=146.630160226 
podStartE2EDuration="2m26.630160226s" podCreationTimestamp="2025-09-30 02:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:24.627724402 +0000 UTC m=+171.800944386" watchObservedRunningTime="2025-09-30 02:57:24.630160226 +0000 UTC m=+171.803380240" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.638774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks7tg" event={"ID":"f469d578-93cd-4537-bdc8-6c8908926457","Type":"ContainerStarted","Data":"d3df1bf5ca656b3ee7c8a0b8250e2ada6ac392772d83a6fea0663726d22860d9"} Sep 30 02:57:24 crc kubenswrapper[4744]: E0930 02:57:24.690506 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5865885_5a62_4a4a_abed_d5a996d65890.slice/crio-4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5865885_5a62_4a4a_abed_d5a996d65890.slice/crio-conmon-4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7.scope\": RecentStats: unable to find data in memory cache]" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.744059 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.746023 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.749344 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.749606 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.757116 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.868111 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.868227 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.969788 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.969875 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.970301 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.989563 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:24 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:24 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:24 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:24 crc kubenswrapper[4744]: I0930 02:57:24.989639 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:24.995348 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.156339 4744 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.202436 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wc6fj" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.422776 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.512776 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.645707 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5865885-5a62-4a4a-abed-d5a996d65890" containerID="4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7" exitCode=0 Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.645815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerDied","Data":"4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7"} Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.648915 4744 generic.go:334] "Generic (PLEG): container finished" podID="f469d578-93cd-4537-bdc8-6c8908926457" containerID="afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb" exitCode=0 Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.648955 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks7tg" event={"ID":"f469d578-93cd-4537-bdc8-6c8908926457","Type":"ContainerDied","Data":"afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb"} Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.655985 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac","Type":"ContainerStarted","Data":"1c913ed3910b424c2fc3c503d37176d27fdb7754ecf62a1cdf9e76b05b435a1f"} Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.789954 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gcwvf"] Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.791666 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.801968 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.805317 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcwvf"] Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.886214 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-utilities\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.886342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbjj\" (UniqueName: \"kubernetes.io/projected/307311a7-837e-48b7-b54b-1830dab633a8-kube-api-access-jwbjj\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.886552 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-catalog-content\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.931936 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.932036 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.933958 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.938269 4744 patch_prober.go:28] interesting pod/console-f9d7485db-7gvsx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.938353 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7gvsx" podUID="dd70937c-9e84-468b-b81f-b9f400436aec" containerName="console" probeResult="failure" output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.946010 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xxnzf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.986507 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.988757 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jwbjj\" (UniqueName: \"kubernetes.io/projected/307311a7-837e-48b7-b54b-1830dab633a8-kube-api-access-jwbjj\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.988879 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-catalog-content\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.989021 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-utilities\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.989684 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-catalog-content\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.989966 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-utilities\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.991617 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:25 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:25 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:25 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:25 crc kubenswrapper[4744]: I0930 02:57:25.991740 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.036592 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbjj\" (UniqueName: \"kubernetes.io/projected/307311a7-837e-48b7-b54b-1830dab633a8-kube-api-access-jwbjj\") pod \"redhat-marketplace-gcwvf\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.146217 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.192110 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zcdq4"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.193677 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.220692 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcdq4"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.294480 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltds\" (UniqueName: \"kubernetes.io/projected/7f186f25-71b8-4181-a625-f8e467dca6b8-kube-api-access-tltds\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.294538 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-catalog-content\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.294807 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-utilities\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.396543 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-utilities\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.396648 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tltds\" (UniqueName: \"kubernetes.io/projected/7f186f25-71b8-4181-a625-f8e467dca6b8-kube-api-access-tltds\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.396667 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-catalog-content\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.397213 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-catalog-content\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.397442 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-utilities\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.417665 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcwvf"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.423660 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltds\" (UniqueName: \"kubernetes.io/projected/7f186f25-71b8-4181-a625-f8e467dca6b8-kube-api-access-tltds\") pod \"redhat-marketplace-zcdq4\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " 
pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.529746 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.592361 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2p9xn"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.593746 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.596967 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.605316 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2p9xn"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.669532 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac","Type":"ContainerStarted","Data":"6d5a435e3aad99dcd90ce96e1c875e9d174431c73cb9e79b91c63178e5a204f8"} Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.672160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerStarted","Data":"7b2df40ff685388e83cf961e432ee21e70489b7b9e5d76c266f1d16d61d76bac"} Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.706119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-utilities\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " 
pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.706209 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-catalog-content\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.706350 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5b6\" (UniqueName: \"kubernetes.io/projected/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-kube-api-access-8w5b6\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.793002 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.792979307 podStartE2EDuration="2.792979307s" podCreationTimestamp="2025-09-30 02:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:57:26.692431701 +0000 UTC m=+173.865651675" watchObservedRunningTime="2025-09-30 02:57:26.792979307 +0000 UTC m=+173.966199281" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.795663 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcdq4"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.807631 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-utilities\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " 
pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.807716 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-catalog-content\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.807848 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5b6\" (UniqueName: \"kubernetes.io/projected/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-kube-api-access-8w5b6\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: W0930 02:57:26.808544 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f186f25_71b8_4181_a625_f8e467dca6b8.slice/crio-db1e68e11dac2145ad08942fd172ca4bca63b1ff52ff4839ab91cc24af424f8d WatchSource:0}: Error finding container db1e68e11dac2145ad08942fd172ca4bca63b1ff52ff4839ab91cc24af424f8d: Status 404 returned error can't find the container with id db1e68e11dac2145ad08942fd172ca4bca63b1ff52ff4839ab91cc24af424f8d Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.808845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-utilities\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.808888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-catalog-content\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.843382 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5b6\" (UniqueName: \"kubernetes.io/projected/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-kube-api-access-8w5b6\") pod \"redhat-operators-2p9xn\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.929477 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.987005 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwnb7"] Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.988791 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.996472 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:26 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:26 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:26 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:26 crc kubenswrapper[4744]: I0930 02:57:26.996666 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.007261 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwnb7"] Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.127883 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-utilities\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.128278 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzk4j\" (UniqueName: \"kubernetes.io/projected/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-kube-api-access-jzk4j\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.128385 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-catalog-content\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.192341 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2p9xn"] Sep 30 02:57:27 crc kubenswrapper[4744]: W0930 02:57:27.219440 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab5a9f2_aa2f_462c_8f45_38a54be2359d.slice/crio-e8ebba163667e440358384e1f22c218d04492be25eb79c302d62f04a071b7532 WatchSource:0}: Error finding container e8ebba163667e440358384e1f22c218d04492be25eb79c302d62f04a071b7532: Status 404 returned error can't find the container with id e8ebba163667e440358384e1f22c218d04492be25eb79c302d62f04a071b7532 Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.229137 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzk4j\" (UniqueName: \"kubernetes.io/projected/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-kube-api-access-jzk4j\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.229193 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-catalog-content\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.229234 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-utilities\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.229741 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-utilities\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.229888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-catalog-content\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.276916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzk4j\" (UniqueName: \"kubernetes.io/projected/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-kube-api-access-jzk4j\") pod \"redhat-operators-xwnb7\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.356213 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.619083 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.620544 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.625973 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.630352 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.631589 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.723592 4744 generic.go:334] "Generic (PLEG): container finished" podID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerID="ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da" exitCode=0 Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.724033 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcdq4" event={"ID":"7f186f25-71b8-4181-a625-f8e467dca6b8","Type":"ContainerDied","Data":"ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da"} Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.724116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcdq4" event={"ID":"7f186f25-71b8-4181-a625-f8e467dca6b8","Type":"ContainerStarted","Data":"db1e68e11dac2145ad08942fd172ca4bca63b1ff52ff4839ab91cc24af424f8d"} Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.733049 4744 generic.go:334] "Generic (PLEG): container finished" podID="52005152-7c9b-40dc-a7f9-f72bc8e1b0ac" containerID="6d5a435e3aad99dcd90ce96e1c875e9d174431c73cb9e79b91c63178e5a204f8" exitCode=0 Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.733162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac","Type":"ContainerDied","Data":"6d5a435e3aad99dcd90ce96e1c875e9d174431c73cb9e79b91c63178e5a204f8"} Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.735667 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.735779 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.764992 4744 generic.go:334] "Generic (PLEG): container finished" podID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerID="bc46d611b0f3548e2939a11e1e1d177638efaab7038f4da4909ab02f9ed62adc" exitCode=0 Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.765115 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p9xn" event={"ID":"0ab5a9f2-aa2f-462c-8f45-38a54be2359d","Type":"ContainerDied","Data":"bc46d611b0f3548e2939a11e1e1d177638efaab7038f4da4909ab02f9ed62adc"} Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.765154 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p9xn" event={"ID":"0ab5a9f2-aa2f-462c-8f45-38a54be2359d","Type":"ContainerStarted","Data":"e8ebba163667e440358384e1f22c218d04492be25eb79c302d62f04a071b7532"} Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.790135 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="307311a7-837e-48b7-b54b-1830dab633a8" containerID="86bd18c02b2adf2ff3d8e7c23ae8ab9f4d799a6758a6a55df0ebd2317a292a0e" exitCode=0 Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.790208 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerDied","Data":"86bd18c02b2adf2ff3d8e7c23ae8ab9f4d799a6758a6a55df0ebd2317a292a0e"} Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.830284 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwnb7"] Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.839256 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.839345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.841942 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.900459 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.957578 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.992140 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:27 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:27 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:27 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:27 crc kubenswrapper[4744]: I0930 02:57:27.992209 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.202505 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 02:57:28 crc kubenswrapper[4744]: W0930 02:57:28.221799 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2a9d3f20_93f0_4844_b4c3_21380fca66b2.slice/crio-a58402b00b7accb72cb9594bb800592b2226ffc0b90dafcbf7c2b45bf1ba54a0 WatchSource:0}: Error finding container a58402b00b7accb72cb9594bb800592b2226ffc0b90dafcbf7c2b45bf1ba54a0: Status 404 returned error can't find the container with id a58402b00b7accb72cb9594bb800592b2226ffc0b90dafcbf7c2b45bf1ba54a0 Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.799240 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a9d3f20-93f0-4844-b4c3-21380fca66b2","Type":"ContainerStarted","Data":"a58402b00b7accb72cb9594bb800592b2226ffc0b90dafcbf7c2b45bf1ba54a0"} Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.808832 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" event={"ID":"b2676764-efb6-4e02-9012-74b8675e7bff","Type":"ContainerDied","Data":"3f96b830334a792bece915589388a820996eb1f7aac1428b07a7c6bdb6871a19"} Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.810521 4744 generic.go:334] "Generic (PLEG): container finished" podID="b2676764-efb6-4e02-9012-74b8675e7bff" containerID="3f96b830334a792bece915589388a820996eb1f7aac1428b07a7c6bdb6871a19" exitCode=0 Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.827684 4744 generic.go:334] "Generic (PLEG): container finished" podID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerID="83be6a63c3b354a28f7f39fbe30d41dec8c22d80a8fbff29874808f3901be34f" exitCode=0 Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.827875 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwnb7" event={"ID":"a4256d91-ff4e-42d6-a2d8-46aefd70ab57","Type":"ContainerDied","Data":"83be6a63c3b354a28f7f39fbe30d41dec8c22d80a8fbff29874808f3901be34f"} Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.827942 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwnb7" event={"ID":"a4256d91-ff4e-42d6-a2d8-46aefd70ab57","Type":"ContainerStarted","Data":"252aa14ecadd5aaed28f2c6570a5e6d25783dff4aa542f1f8dcaa19a81357bf9"} Sep 30 02:57:28 crc kubenswrapper[4744]: I0930 02:57:28.999659 4744 patch_prober.go:28] interesting pod/router-default-5444994796-r5jzt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Sep 30 02:57:28 crc kubenswrapper[4744]: [-]has-synced failed: reason withheld Sep 30 02:57:28 crc kubenswrapper[4744]: [+]process-running ok Sep 30 02:57:28 crc kubenswrapper[4744]: healthz check failed Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:28.999726 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5jzt" podUID="e42f51b8-542f-4784-88cd-89832dfc1999" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.205622 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.370025 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kubelet-dir\") pod \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.370107 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kube-api-access\") pod \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\" (UID: \"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac\") " Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.372224 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52005152-7c9b-40dc-a7f9-f72bc8e1b0ac" (UID: "52005152-7c9b-40dc-a7f9-f72bc8e1b0ac"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.380481 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52005152-7c9b-40dc-a7f9-f72bc8e1b0ac" (UID: "52005152-7c9b-40dc-a7f9-f72bc8e1b0ac"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.471902 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.471945 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52005152-7c9b-40dc-a7f9-f72bc8e1b0ac-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.839812 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"52005152-7c9b-40dc-a7f9-f72bc8e1b0ac","Type":"ContainerDied","Data":"1c913ed3910b424c2fc3c503d37176d27fdb7754ecf62a1cdf9e76b05b435a1f"} Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.839859 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c913ed3910b424c2fc3c503d37176d27fdb7754ecf62a1cdf9e76b05b435a1f" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.839869 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.842920 4744 generic.go:334] "Generic (PLEG): container finished" podID="2a9d3f20-93f0-4844-b4c3-21380fca66b2" containerID="ff1d28e0d019d9f2c4c14b51dfb4627abe55f5f7f97f06030574875a3c7b9438" exitCode=0 Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.843100 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a9d3f20-93f0-4844-b4c3-21380fca66b2","Type":"ContainerDied","Data":"ff1d28e0d019d9f2c4c14b51dfb4627abe55f5f7f97f06030574875a3c7b9438"} Sep 30 02:57:29 crc kubenswrapper[4744]: I0930 02:57:29.996799 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.002804 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r5jzt" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.134531 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.286839 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2676764-efb6-4e02-9012-74b8675e7bff-secret-volume\") pod \"b2676764-efb6-4e02-9012-74b8675e7bff\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.286891 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnpxl\" (UniqueName: \"kubernetes.io/projected/b2676764-efb6-4e02-9012-74b8675e7bff-kube-api-access-nnpxl\") pod \"b2676764-efb6-4e02-9012-74b8675e7bff\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.286952 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2676764-efb6-4e02-9012-74b8675e7bff-config-volume\") pod \"b2676764-efb6-4e02-9012-74b8675e7bff\" (UID: \"b2676764-efb6-4e02-9012-74b8675e7bff\") " Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.288999 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2676764-efb6-4e02-9012-74b8675e7bff-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2676764-efb6-4e02-9012-74b8675e7bff" (UID: "b2676764-efb6-4e02-9012-74b8675e7bff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.293054 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2676764-efb6-4e02-9012-74b8675e7bff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2676764-efb6-4e02-9012-74b8675e7bff" (UID: "b2676764-efb6-4e02-9012-74b8675e7bff"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.293804 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2676764-efb6-4e02-9012-74b8675e7bff-kube-api-access-nnpxl" (OuterVolumeSpecName: "kube-api-access-nnpxl") pod "b2676764-efb6-4e02-9012-74b8675e7bff" (UID: "b2676764-efb6-4e02-9012-74b8675e7bff"). InnerVolumeSpecName "kube-api-access-nnpxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.388202 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2676764-efb6-4e02-9012-74b8675e7bff-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.388237 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnpxl\" (UniqueName: \"kubernetes.io/projected/b2676764-efb6-4e02-9012-74b8675e7bff-kube-api-access-nnpxl\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.388248 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2676764-efb6-4e02-9012-74b8675e7bff-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.883517 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.887448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx" event={"ID":"b2676764-efb6-4e02-9012-74b8675e7bff","Type":"ContainerDied","Data":"eae9c434ca83d18c314f4f13a2361ec2f930d0e33821052c947cf4567f861c9f"} Sep 30 02:57:30 crc kubenswrapper[4744]: I0930 02:57:30.887508 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae9c434ca83d18c314f4f13a2361ec2f930d0e33821052c947cf4567f861c9f" Sep 30 02:57:31 crc kubenswrapper[4744]: I0930 02:57:31.472609 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zpfzk" Sep 30 02:57:34 crc kubenswrapper[4744]: I0930 02:57:34.084772 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bpp4m" Sep 30 02:57:34 crc kubenswrapper[4744]: I0930 02:57:34.347765 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 02:57:34 crc kubenswrapper[4744]: I0930 02:57:34.347978 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 02:57:35 crc kubenswrapper[4744]: I0930 02:57:35.936099 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:35 crc kubenswrapper[4744]: 
I0930 02:57:35.940396 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.379229 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.542991 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kubelet-dir\") pod \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.543088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kube-api-access\") pod \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\" (UID: \"2a9d3f20-93f0-4844-b4c3-21380fca66b2\") " Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.543129 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2a9d3f20-93f0-4844-b4c3-21380fca66b2" (UID: "2a9d3f20-93f0-4844-b4c3-21380fca66b2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.543612 4744 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.551331 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2a9d3f20-93f0-4844-b4c3-21380fca66b2" (UID: "2a9d3f20-93f0-4844-b4c3-21380fca66b2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.645270 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a9d3f20-93f0-4844-b4c3-21380fca66b2-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.953811 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a9d3f20-93f0-4844-b4c3-21380fca66b2","Type":"ContainerDied","Data":"a58402b00b7accb72cb9594bb800592b2226ffc0b90dafcbf7c2b45bf1ba54a0"} Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.953862 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58402b00b7accb72cb9594bb800592b2226ffc0b90dafcbf7c2b45bf1ba54a0" Sep 30 02:57:38 crc kubenswrapper[4744]: I0930 02:57:38.953934 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 02:57:44 crc kubenswrapper[4744]: I0930 02:57:44.062930 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.016434 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.017626 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llgd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:ni
l,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bm469_openshift-marketplace(13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.018878 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bm469" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.045824 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bm469" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.074071 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.074342 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwbjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gcwvf_openshift-marketplace(307311a7-837e-48b7-b54b-1830dab633a8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.077049 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gcwvf" podUID="307311a7-837e-48b7-b54b-1830dab633a8" Sep 30 02:57:51 crc 
kubenswrapper[4744]: E0930 02:57:51.105089 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.105288 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l9t5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-ks7tg_openshift-marketplace(f469d578-93cd-4537-bdc8-6c8908926457): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.106668 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ks7tg" podUID="f469d578-93cd-4537-bdc8-6c8908926457" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.115628 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.115831 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jzk4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xwnb7_openshift-marketplace(a4256d91-ff4e-42d6-a2d8-46aefd70ab57): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 02:57:51 crc kubenswrapper[4744]: E0930 02:57:51.117259 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xwnb7" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" Sep 30 02:57:52 crc 
kubenswrapper[4744]: I0930 02:57:52.053485 4744 generic.go:334] "Generic (PLEG): container finished" podID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerID="38a58a100913239572592128b3dfd8393eb66b0834cf183d4d7df89edc7cab73" exitCode=0 Sep 30 02:57:52 crc kubenswrapper[4744]: I0930 02:57:52.053623 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p9xn" event={"ID":"0ab5a9f2-aa2f-462c-8f45-38a54be2359d","Type":"ContainerDied","Data":"38a58a100913239572592128b3dfd8393eb66b0834cf183d4d7df89edc7cab73"} Sep 30 02:57:52 crc kubenswrapper[4744]: I0930 02:57:52.058145 4744 generic.go:334] "Generic (PLEG): container finished" podID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerID="e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0" exitCode=0 Sep 30 02:57:52 crc kubenswrapper[4744]: I0930 02:57:52.058245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj824" event={"ID":"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91","Type":"ContainerDied","Data":"e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0"} Sep 30 02:57:52 crc kubenswrapper[4744]: I0930 02:57:52.062980 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5865885-5a62-4a4a-abed-d5a996d65890" containerID="f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458" exitCode=0 Sep 30 02:57:52 crc kubenswrapper[4744]: I0930 02:57:52.063086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerDied","Data":"f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458"} Sep 30 02:57:52 crc kubenswrapper[4744]: I0930 02:57:52.072818 4744 generic.go:334] "Generic (PLEG): container finished" podID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerID="df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31" exitCode=0 Sep 30 02:57:52 crc 
kubenswrapper[4744]: I0930 02:57:52.073184 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcdq4" event={"ID":"7f186f25-71b8-4181-a625-f8e467dca6b8","Type":"ContainerDied","Data":"df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31"} Sep 30 02:57:52 crc kubenswrapper[4744]: E0930 02:57:52.075002 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ks7tg" podUID="f469d578-93cd-4537-bdc8-6c8908926457" Sep 30 02:57:52 crc kubenswrapper[4744]: E0930 02:57:52.087750 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gcwvf" podUID="307311a7-837e-48b7-b54b-1830dab633a8" Sep 30 02:57:52 crc kubenswrapper[4744]: E0930 02:57:52.092608 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xwnb7" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.085989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj824" event={"ID":"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91","Type":"ContainerStarted","Data":"8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd"} Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.090825 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" 
event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerStarted","Data":"60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9"} Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.094693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcdq4" event={"ID":"7f186f25-71b8-4181-a625-f8e467dca6b8","Type":"ContainerStarted","Data":"0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78"} Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.098252 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p9xn" event={"ID":"0ab5a9f2-aa2f-462c-8f45-38a54be2359d","Type":"ContainerStarted","Data":"c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8"} Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.162946 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mrkxm" podStartSLOduration=2.190505549 podStartE2EDuration="30.162908022s" podCreationTimestamp="2025-09-30 02:57:23 +0000 UTC" firstStartedPulling="2025-09-30 02:57:24.631120536 +0000 UTC m=+171.804340510" lastFinishedPulling="2025-09-30 02:57:52.603523009 +0000 UTC m=+199.776742983" observedRunningTime="2025-09-30 02:57:53.160505798 +0000 UTC m=+200.333725782" watchObservedRunningTime="2025-09-30 02:57:53.162908022 +0000 UTC m=+200.336128036" Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.166230 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dj824" podStartSLOduration=2.155916396 podStartE2EDuration="30.166216903s" podCreationTimestamp="2025-09-30 02:57:23 +0000 UTC" firstStartedPulling="2025-09-30 02:57:24.620009586 +0000 UTC m=+171.793229569" lastFinishedPulling="2025-09-30 02:57:52.630310102 +0000 UTC m=+199.803530076" observedRunningTime="2025-09-30 02:57:53.13066774 +0000 UTC m=+200.303887754" 
watchObservedRunningTime="2025-09-30 02:57:53.166216903 +0000 UTC m=+200.339436917" Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.197494 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2p9xn" podStartSLOduration=2.334684853 podStartE2EDuration="27.197472044s" podCreationTimestamp="2025-09-30 02:57:26 +0000 UTC" firstStartedPulling="2025-09-30 02:57:27.774688082 +0000 UTC m=+174.947908056" lastFinishedPulling="2025-09-30 02:57:52.637475273 +0000 UTC m=+199.810695247" observedRunningTime="2025-09-30 02:57:53.195572196 +0000 UTC m=+200.368792220" watchObservedRunningTime="2025-09-30 02:57:53.197472044 +0000 UTC m=+200.370692028" Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.226335 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zcdq4" podStartSLOduration=2.414533648 podStartE2EDuration="27.22630946s" podCreationTimestamp="2025-09-30 02:57:26 +0000 UTC" firstStartedPulling="2025-09-30 02:57:27.742660008 +0000 UTC m=+174.915879982" lastFinishedPulling="2025-09-30 02:57:52.55443579 +0000 UTC m=+199.727655794" observedRunningTime="2025-09-30 02:57:53.223819764 +0000 UTC m=+200.397039748" watchObservedRunningTime="2025-09-30 02:57:53.22630946 +0000 UTC m=+200.399529434" Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.721768 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:53 crc kubenswrapper[4744]: I0930 02:57:53.721859 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:57:54 crc kubenswrapper[4744]: I0930 02:57:54.136570 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:54 crc kubenswrapper[4744]: I0930 02:57:54.136676 4744 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:57:54 crc kubenswrapper[4744]: I0930 02:57:54.871346 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dj824" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="registry-server" probeResult="failure" output=< Sep 30 02:57:54 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 02:57:54 crc kubenswrapper[4744]: > Sep 30 02:57:55 crc kubenswrapper[4744]: I0930 02:57:55.179636 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mrkxm" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="registry-server" probeResult="failure" output=< Sep 30 02:57:55 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 02:57:55 crc kubenswrapper[4744]: > Sep 30 02:57:56 crc kubenswrapper[4744]: I0930 02:57:56.363614 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-n9lhl" Sep 30 02:57:56 crc kubenswrapper[4744]: I0930 02:57:56.530923 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:56 crc kubenswrapper[4744]: I0930 02:57:56.531035 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:56 crc kubenswrapper[4744]: I0930 02:57:56.621205 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:56 crc kubenswrapper[4744]: I0930 02:57:56.930222 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:56 crc kubenswrapper[4744]: I0930 02:57:56.930326 4744 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:57:57 crc kubenswrapper[4744]: I0930 02:57:57.195678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:57:57 crc kubenswrapper[4744]: I0930 02:57:57.998581 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2p9xn" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="registry-server" probeResult="failure" output=< Sep 30 02:57:57 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 02:57:57 crc kubenswrapper[4744]: > Sep 30 02:58:00 crc kubenswrapper[4744]: I0930 02:58:00.506668 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcdq4"] Sep 30 02:58:00 crc kubenswrapper[4744]: I0930 02:58:00.509943 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zcdq4" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="registry-server" containerID="cri-o://0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78" gracePeriod=2 Sep 30 02:58:00 crc kubenswrapper[4744]: I0930 02:58:00.996282 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.137448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-utilities\") pod \"7f186f25-71b8-4181-a625-f8e467dca6b8\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.137581 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-catalog-content\") pod \"7f186f25-71b8-4181-a625-f8e467dca6b8\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.139528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-utilities" (OuterVolumeSpecName: "utilities") pod "7f186f25-71b8-4181-a625-f8e467dca6b8" (UID: "7f186f25-71b8-4181-a625-f8e467dca6b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.145296 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tltds\" (UniqueName: \"kubernetes.io/projected/7f186f25-71b8-4181-a625-f8e467dca6b8-kube-api-access-tltds\") pod \"7f186f25-71b8-4181-a625-f8e467dca6b8\" (UID: \"7f186f25-71b8-4181-a625-f8e467dca6b8\") " Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.146433 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.155813 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f186f25-71b8-4181-a625-f8e467dca6b8" (UID: "7f186f25-71b8-4181-a625-f8e467dca6b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.156362 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f186f25-71b8-4181-a625-f8e467dca6b8-kube-api-access-tltds" (OuterVolumeSpecName: "kube-api-access-tltds") pod "7f186f25-71b8-4181-a625-f8e467dca6b8" (UID: "7f186f25-71b8-4181-a625-f8e467dca6b8"). InnerVolumeSpecName "kube-api-access-tltds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.161504 4744 generic.go:334] "Generic (PLEG): container finished" podID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerID="0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78" exitCode=0 Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.161560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcdq4" event={"ID":"7f186f25-71b8-4181-a625-f8e467dca6b8","Type":"ContainerDied","Data":"0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78"} Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.161603 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zcdq4" event={"ID":"7f186f25-71b8-4181-a625-f8e467dca6b8","Type":"ContainerDied","Data":"db1e68e11dac2145ad08942fd172ca4bca63b1ff52ff4839ab91cc24af424f8d"} Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.161627 4744 scope.go:117] "RemoveContainer" containerID="0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.161724 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zcdq4" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.205221 4744 scope.go:117] "RemoveContainer" containerID="df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.230603 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcdq4"] Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.238350 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zcdq4"] Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.239958 4744 scope.go:117] "RemoveContainer" containerID="ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.248162 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f186f25-71b8-4181-a625-f8e467dca6b8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.248208 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tltds\" (UniqueName: \"kubernetes.io/projected/7f186f25-71b8-4181-a625-f8e467dca6b8-kube-api-access-tltds\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.265056 4744 scope.go:117] "RemoveContainer" containerID="0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78" Sep 30 02:58:01 crc kubenswrapper[4744]: E0930 02:58:01.265966 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78\": container with ID starting with 0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78 not found: ID does not exist" containerID="0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78" Sep 30 02:58:01 crc 
kubenswrapper[4744]: I0930 02:58:01.266057 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78"} err="failed to get container status \"0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78\": rpc error: code = NotFound desc = could not find container \"0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78\": container with ID starting with 0e1bcf5dc434eea3793a8ee1332e43af9fcecfd304dbd3bf25dae3a5652a8d78 not found: ID does not exist" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.266159 4744 scope.go:117] "RemoveContainer" containerID="df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31" Sep 30 02:58:01 crc kubenswrapper[4744]: E0930 02:58:01.266790 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31\": container with ID starting with df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31 not found: ID does not exist" containerID="df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.266831 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31"} err="failed to get container status \"df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31\": rpc error: code = NotFound desc = could not find container \"df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31\": container with ID starting with df6c03638a6bc68ba11e6fdf2e6664c832620c1612f02a80954e025f9ed0bb31 not found: ID does not exist" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.266858 4744 scope.go:117] "RemoveContainer" containerID="ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da" Sep 30 
02:58:01 crc kubenswrapper[4744]: E0930 02:58:01.267178 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da\": container with ID starting with ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da not found: ID does not exist" containerID="ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.267211 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da"} err="failed to get container status \"ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da\": rpc error: code = NotFound desc = could not find container \"ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da\": container with ID starting with ef281054732490357048b7f57833ec94a0f4ec3b395c45138c31b64a55a3b6da not found: ID does not exist" Sep 30 02:58:01 crc kubenswrapper[4744]: I0930 02:58:01.515204 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" path="/var/lib/kubelet/pods/7f186f25-71b8-4181-a625-f8e467dca6b8/volumes" Sep 30 02:58:03 crc kubenswrapper[4744]: I0930 02:58:03.774543 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:58:03 crc kubenswrapper[4744]: I0930 02:58:03.825881 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.191454 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.253708 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.348128 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.348221 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.348282 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.349037 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.349115 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173" gracePeriod=600 Sep 30 02:58:04 crc kubenswrapper[4744]: I0930 02:58:04.565132 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-mrkxm"] Sep 30 02:58:05 crc kubenswrapper[4744]: I0930 02:58:05.195783 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173" exitCode=0 Sep 30 02:58:05 crc kubenswrapper[4744]: I0930 02:58:05.196589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173"} Sep 30 02:58:05 crc kubenswrapper[4744]: I0930 02:58:05.196672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"94351d00f554b003d6de416947e6ae7daf34d530ef3993fcc2dd9dc065b4279b"} Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.206185 4744 generic.go:334] "Generic (PLEG): container finished" podID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerID="542f8975cface6cf01334845879a823c18ebe2968795cac54a55ea68fe18f9bc" exitCode=0 Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.206244 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwnb7" event={"ID":"a4256d91-ff4e-42d6-a2d8-46aefd70ab57","Type":"ContainerDied","Data":"542f8975cface6cf01334845879a823c18ebe2968795cac54a55ea68fe18f9bc"} Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.207252 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mrkxm" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="registry-server" containerID="cri-o://60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9" gracePeriod=2 Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.663444 4744 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.746453 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-utilities\") pod \"e5865885-5a62-4a4a-abed-d5a996d65890\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.746543 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-catalog-content\") pod \"e5865885-5a62-4a4a-abed-d5a996d65890\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.746585 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zms2x\" (UniqueName: \"kubernetes.io/projected/e5865885-5a62-4a4a-abed-d5a996d65890-kube-api-access-zms2x\") pod \"e5865885-5a62-4a4a-abed-d5a996d65890\" (UID: \"e5865885-5a62-4a4a-abed-d5a996d65890\") " Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.747192 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-utilities" (OuterVolumeSpecName: "utilities") pod "e5865885-5a62-4a4a-abed-d5a996d65890" (UID: "e5865885-5a62-4a4a-abed-d5a996d65890"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.770478 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5865885-5a62-4a4a-abed-d5a996d65890-kube-api-access-zms2x" (OuterVolumeSpecName: "kube-api-access-zms2x") pod "e5865885-5a62-4a4a-abed-d5a996d65890" (UID: "e5865885-5a62-4a4a-abed-d5a996d65890"). 
InnerVolumeSpecName "kube-api-access-zms2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.830588 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5865885-5a62-4a4a-abed-d5a996d65890" (UID: "e5865885-5a62-4a4a-abed-d5a996d65890"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.848429 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.848476 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5865885-5a62-4a4a-abed-d5a996d65890-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.848492 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zms2x\" (UniqueName: \"kubernetes.io/projected/e5865885-5a62-4a4a-abed-d5a996d65890-kube-api-access-zms2x\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:06 crc kubenswrapper[4744]: I0930 02:58:06.976595 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.034607 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.216734 4744 generic.go:334] "Generic (PLEG): container finished" podID="e5865885-5a62-4a4a-abed-d5a996d65890" containerID="60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9" exitCode=0 Sep 30 
02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.216815 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerDied","Data":"60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9"} Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.216865 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mrkxm" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.217172 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mrkxm" event={"ID":"e5865885-5a62-4a4a-abed-d5a996d65890","Type":"ContainerDied","Data":"573d59ca5b4df3c4b072d34ae106e41d02c93588c25698be44920f5aeff69608"} Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.217205 4744 scope.go:117] "RemoveContainer" containerID="60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.222811 4744 generic.go:334] "Generic (PLEG): container finished" podID="f469d578-93cd-4537-bdc8-6c8908926457" containerID="7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08" exitCode=0 Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.222875 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks7tg" event={"ID":"f469d578-93cd-4537-bdc8-6c8908926457","Type":"ContainerDied","Data":"7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08"} Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.228246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwnb7" event={"ID":"a4256d91-ff4e-42d6-a2d8-46aefd70ab57","Type":"ContainerStarted","Data":"b9fe734051a50150e73a025628e3c854c49720928430b638ae96d28c79c4c267"} Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.253511 4744 scope.go:117] 
"RemoveContainer" containerID="f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.267732 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwnb7" podStartSLOduration=3.357714024 podStartE2EDuration="41.267705892s" podCreationTimestamp="2025-09-30 02:57:26 +0000 UTC" firstStartedPulling="2025-09-30 02:57:28.832156881 +0000 UTC m=+176.005376845" lastFinishedPulling="2025-09-30 02:58:06.742148739 +0000 UTC m=+213.915368713" observedRunningTime="2025-09-30 02:58:07.265428742 +0000 UTC m=+214.438648726" watchObservedRunningTime="2025-09-30 02:58:07.267705892 +0000 UTC m=+214.440925866" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.285792 4744 scope.go:117] "RemoveContainer" containerID="4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.288527 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mrkxm"] Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.297767 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mrkxm"] Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.309066 4744 scope.go:117] "RemoveContainer" containerID="60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9" Sep 30 02:58:07 crc kubenswrapper[4744]: E0930 02:58:07.309786 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9\": container with ID starting with 60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9 not found: ID does not exist" containerID="60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.309830 4744 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9"} err="failed to get container status \"60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9\": rpc error: code = NotFound desc = could not find container \"60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9\": container with ID starting with 60825ef361fea881064ae5db46ef7fb606516f65d9e6cfdfa66423550b720ac9 not found: ID does not exist" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.309864 4744 scope.go:117] "RemoveContainer" containerID="f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458" Sep 30 02:58:07 crc kubenswrapper[4744]: E0930 02:58:07.310242 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458\": container with ID starting with f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458 not found: ID does not exist" containerID="f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.310269 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458"} err="failed to get container status \"f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458\": rpc error: code = NotFound desc = could not find container \"f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458\": container with ID starting with f365a4f5b18f2dc070d2059c2e1977f187408e742d29fb0b74a298e7cf0ab458 not found: ID does not exist" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.310287 4744 scope.go:117] "RemoveContainer" containerID="4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7" Sep 30 02:58:07 crc kubenswrapper[4744]: E0930 02:58:07.310746 4744 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7\": container with ID starting with 4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7 not found: ID does not exist" containerID="4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.310769 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7"} err="failed to get container status \"4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7\": rpc error: code = NotFound desc = could not find container \"4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7\": container with ID starting with 4395e2345dddc9cc8e640030e01ff2f5ac177bc1d7263315c4311b4d9e5a86a7 not found: ID does not exist" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.357461 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.357517 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:58:07 crc kubenswrapper[4744]: I0930 02:58:07.514528 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" path="/var/lib/kubelet/pods/e5865885-5a62-4a4a-abed-d5a996d65890/volumes" Sep 30 02:58:08 crc kubenswrapper[4744]: I0930 02:58:08.239293 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerStarted","Data":"5362330faf9e9c2694bea6e74c4f34914ebd3fa06b55a85573b33a1e28caa120"} Sep 30 02:58:08 crc kubenswrapper[4744]: I0930 02:58:08.242256 4744 
generic.go:334] "Generic (PLEG): container finished" podID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerID="8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3" exitCode=0 Sep 30 02:58:08 crc kubenswrapper[4744]: I0930 02:58:08.242325 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm469" event={"ID":"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4","Type":"ContainerDied","Data":"8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3"} Sep 30 02:58:08 crc kubenswrapper[4744]: I0930 02:58:08.249705 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks7tg" event={"ID":"f469d578-93cd-4537-bdc8-6c8908926457","Type":"ContainerStarted","Data":"1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33"} Sep 30 02:58:08 crc kubenswrapper[4744]: I0930 02:58:08.283766 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ks7tg" podStartSLOduration=3.191196965 podStartE2EDuration="45.28374422s" podCreationTimestamp="2025-09-30 02:57:23 +0000 UTC" firstStartedPulling="2025-09-30 02:57:25.652497018 +0000 UTC m=+172.825717002" lastFinishedPulling="2025-09-30 02:58:07.745044283 +0000 UTC m=+214.918264257" observedRunningTime="2025-09-30 02:58:08.281083369 +0000 UTC m=+215.454303343" watchObservedRunningTime="2025-09-30 02:58:08.28374422 +0000 UTC m=+215.456964194" Sep 30 02:58:08 crc kubenswrapper[4744]: I0930 02:58:08.407479 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xwnb7" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="registry-server" probeResult="failure" output=< Sep 30 02:58:08 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 02:58:08 crc kubenswrapper[4744]: > Sep 30 02:58:09 crc kubenswrapper[4744]: I0930 02:58:09.258234 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="307311a7-837e-48b7-b54b-1830dab633a8" containerID="5362330faf9e9c2694bea6e74c4f34914ebd3fa06b55a85573b33a1e28caa120" exitCode=0 Sep 30 02:58:09 crc kubenswrapper[4744]: I0930 02:58:09.258818 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerDied","Data":"5362330faf9e9c2694bea6e74c4f34914ebd3fa06b55a85573b33a1e28caa120"} Sep 30 02:58:09 crc kubenswrapper[4744]: I0930 02:58:09.266140 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm469" event={"ID":"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4","Type":"ContainerStarted","Data":"93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1"} Sep 30 02:58:09 crc kubenswrapper[4744]: I0930 02:58:09.308458 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bm469" podStartSLOduration=2.242433956 podStartE2EDuration="46.308435304s" podCreationTimestamp="2025-09-30 02:57:23 +0000 UTC" firstStartedPulling="2025-09-30 02:57:24.588756475 +0000 UTC m=+171.761976449" lastFinishedPulling="2025-09-30 02:58:08.654757823 +0000 UTC m=+215.827977797" observedRunningTime="2025-09-30 02:58:09.302998656 +0000 UTC m=+216.476218630" watchObservedRunningTime="2025-09-30 02:58:09.308435304 +0000 UTC m=+216.481655288" Sep 30 02:58:10 crc kubenswrapper[4744]: I0930 02:58:10.277179 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerStarted","Data":"eb23f407fa1db4ebe40c4a67e8c6b3a8669a478321ccd86c0598c7ce01626816"} Sep 30 02:58:13 crc kubenswrapper[4744]: I0930 02:58:13.927958 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:58:13 crc kubenswrapper[4744]: I0930 02:58:13.928496 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:58:13 crc kubenswrapper[4744]: I0930 02:58:13.983357 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:58:14 crc kubenswrapper[4744]: I0930 02:58:14.003261 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gcwvf" podStartSLOduration=7.17200176 podStartE2EDuration="49.003242818s" podCreationTimestamp="2025-09-30 02:57:25 +0000 UTC" firstStartedPulling="2025-09-30 02:57:27.811269004 +0000 UTC m=+174.984488978" lastFinishedPulling="2025-09-30 02:58:09.642510062 +0000 UTC m=+216.815730036" observedRunningTime="2025-09-30 02:58:10.305786687 +0000 UTC m=+217.479006661" watchObservedRunningTime="2025-09-30 02:58:14.003242818 +0000 UTC m=+221.176462792" Sep 30 02:58:14 crc kubenswrapper[4744]: I0930 02:58:14.348188 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:58:14 crc kubenswrapper[4744]: I0930 02:58:14.348257 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:58:14 crc kubenswrapper[4744]: I0930 02:58:14.349717 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:58:14 crc kubenswrapper[4744]: I0930 02:58:14.414458 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:58:14 crc kubenswrapper[4744]: I0930 02:58:14.935730 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5rwhq"] Sep 30 02:58:15 crc kubenswrapper[4744]: I0930 02:58:15.368835 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:58:16 crc kubenswrapper[4744]: I0930 02:58:16.147414 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:58:16 crc kubenswrapper[4744]: I0930 02:58:16.147483 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:58:16 crc kubenswrapper[4744]: I0930 02:58:16.185081 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:58:16 crc kubenswrapper[4744]: I0930 02:58:16.364475 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:58:17 crc kubenswrapper[4744]: I0930 02:58:17.398949 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:58:17 crc kubenswrapper[4744]: I0930 02:58:17.444505 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:58:17 crc kubenswrapper[4744]: I0930 02:58:17.701742 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ks7tg"] Sep 30 02:58:17 crc kubenswrapper[4744]: I0930 02:58:17.701955 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ks7tg" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="registry-server" containerID="cri-o://1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33" gracePeriod=2 Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.097594 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.221573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-utilities\") pod \"f469d578-93cd-4537-bdc8-6c8908926457\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.221642 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-catalog-content\") pod \"f469d578-93cd-4537-bdc8-6c8908926457\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.221674 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9t5j\" (UniqueName: \"kubernetes.io/projected/f469d578-93cd-4537-bdc8-6c8908926457-kube-api-access-l9t5j\") pod \"f469d578-93cd-4537-bdc8-6c8908926457\" (UID: \"f469d578-93cd-4537-bdc8-6c8908926457\") " Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.222528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-utilities" (OuterVolumeSpecName: "utilities") pod "f469d578-93cd-4537-bdc8-6c8908926457" (UID: "f469d578-93cd-4537-bdc8-6c8908926457"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.232069 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f469d578-93cd-4537-bdc8-6c8908926457-kube-api-access-l9t5j" (OuterVolumeSpecName: "kube-api-access-l9t5j") pod "f469d578-93cd-4537-bdc8-6c8908926457" (UID: "f469d578-93cd-4537-bdc8-6c8908926457"). InnerVolumeSpecName "kube-api-access-l9t5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.289606 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f469d578-93cd-4537-bdc8-6c8908926457" (UID: "f469d578-93cd-4537-bdc8-6c8908926457"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.323477 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.323524 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f469d578-93cd-4537-bdc8-6c8908926457-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.323537 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9t5j\" (UniqueName: \"kubernetes.io/projected/f469d578-93cd-4537-bdc8-6c8908926457-kube-api-access-l9t5j\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.334854 4744 generic.go:334] "Generic (PLEG): container finished" podID="f469d578-93cd-4537-bdc8-6c8908926457" containerID="1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33" exitCode=0 Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.335352 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ks7tg" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.335934 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks7tg" event={"ID":"f469d578-93cd-4537-bdc8-6c8908926457","Type":"ContainerDied","Data":"1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33"} Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.335978 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks7tg" event={"ID":"f469d578-93cd-4537-bdc8-6c8908926457","Type":"ContainerDied","Data":"d3df1bf5ca656b3ee7c8a0b8250e2ada6ac392772d83a6fea0663726d22860d9"} Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.336003 4744 scope.go:117] "RemoveContainer" containerID="1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.373234 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ks7tg"] Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.379876 4744 scope.go:117] "RemoveContainer" containerID="7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.383465 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ks7tg"] Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.399783 4744 scope.go:117] "RemoveContainer" containerID="afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.422463 4744 scope.go:117] "RemoveContainer" containerID="1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33" Sep 30 02:58:18 crc kubenswrapper[4744]: E0930 02:58:18.427441 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33\": container with ID starting with 1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33 not found: ID does not exist" containerID="1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.427520 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33"} err="failed to get container status \"1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33\": rpc error: code = NotFound desc = could not find container \"1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33\": container with ID starting with 1cb0e0185d6e80eb45656d851617e58593886a792449409b8eff9e9a73426e33 not found: ID does not exist" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.427604 4744 scope.go:117] "RemoveContainer" containerID="7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08" Sep 30 02:58:18 crc kubenswrapper[4744]: E0930 02:58:18.428078 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08\": container with ID starting with 7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08 not found: ID does not exist" containerID="7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.428113 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08"} err="failed to get container status \"7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08\": rpc error: code = NotFound desc = could not find container \"7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08\": container with ID 
starting with 7fea6e3266a5faf4ebbb8402f2adfc69deecdfd6101ddb6650a44be935420b08 not found: ID does not exist" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.428128 4744 scope.go:117] "RemoveContainer" containerID="afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb" Sep 30 02:58:18 crc kubenswrapper[4744]: E0930 02:58:18.428565 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb\": container with ID starting with afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb not found: ID does not exist" containerID="afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb" Sep 30 02:58:18 crc kubenswrapper[4744]: I0930 02:58:18.428625 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb"} err="failed to get container status \"afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb\": rpc error: code = NotFound desc = could not find container \"afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb\": container with ID starting with afd2d138d13643270648da87f2151ee1bfcef02e4b1bd80f06d1f71cccca52eb not found: ID does not exist" Sep 30 02:58:19 crc kubenswrapper[4744]: I0930 02:58:19.516754 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f469d578-93cd-4537-bdc8-6c8908926457" path="/var/lib/kubelet/pods/f469d578-93cd-4537-bdc8-6c8908926457/volumes" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.103588 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwnb7"] Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.104081 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwnb7" 
podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="registry-server" containerID="cri-o://b9fe734051a50150e73a025628e3c854c49720928430b638ae96d28c79c4c267" gracePeriod=2 Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.359606 4744 generic.go:334] "Generic (PLEG): container finished" podID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerID="b9fe734051a50150e73a025628e3c854c49720928430b638ae96d28c79c4c267" exitCode=0 Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.359727 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwnb7" event={"ID":"a4256d91-ff4e-42d6-a2d8-46aefd70ab57","Type":"ContainerDied","Data":"b9fe734051a50150e73a025628e3c854c49720928430b638ae96d28c79c4c267"} Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.465575 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.655162 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-utilities\") pod \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.655306 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzk4j\" (UniqueName: \"kubernetes.io/projected/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-kube-api-access-jzk4j\") pod \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\" (UID: \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.655362 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-catalog-content\") pod \"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\" (UID: 
\"a4256d91-ff4e-42d6-a2d8-46aefd70ab57\") " Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.656112 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-utilities" (OuterVolumeSpecName: "utilities") pod "a4256d91-ff4e-42d6-a2d8-46aefd70ab57" (UID: "a4256d91-ff4e-42d6-a2d8-46aefd70ab57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.664767 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-kube-api-access-jzk4j" (OuterVolumeSpecName: "kube-api-access-jzk4j") pod "a4256d91-ff4e-42d6-a2d8-46aefd70ab57" (UID: "a4256d91-ff4e-42d6-a2d8-46aefd70ab57"). InnerVolumeSpecName "kube-api-access-jzk4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.752206 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4256d91-ff4e-42d6-a2d8-46aefd70ab57" (UID: "a4256d91-ff4e-42d6-a2d8-46aefd70ab57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.756864 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.756892 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzk4j\" (UniqueName: \"kubernetes.io/projected/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-kube-api-access-jzk4j\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:20 crc kubenswrapper[4744]: I0930 02:58:20.756905 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4256d91-ff4e-42d6-a2d8-46aefd70ab57-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.369643 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwnb7" event={"ID":"a4256d91-ff4e-42d6-a2d8-46aefd70ab57","Type":"ContainerDied","Data":"252aa14ecadd5aaed28f2c6570a5e6d25783dff4aa542f1f8dcaa19a81357bf9"} Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.369704 4744 scope.go:117] "RemoveContainer" containerID="b9fe734051a50150e73a025628e3c854c49720928430b638ae96d28c79c4c267" Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.369753 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwnb7" Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.395718 4744 scope.go:117] "RemoveContainer" containerID="542f8975cface6cf01334845879a823c18ebe2968795cac54a55ea68fe18f9bc" Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.407449 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwnb7"] Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.409927 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xwnb7"] Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.437727 4744 scope.go:117] "RemoveContainer" containerID="83be6a63c3b354a28f7f39fbe30d41dec8c22d80a8fbff29874808f3901be34f" Sep 30 02:58:21 crc kubenswrapper[4744]: I0930 02:58:21.511199 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" path="/var/lib/kubelet/pods/a4256d91-ff4e-42d6-a2d8-46aefd70ab57/volumes" Sep 30 02:58:39 crc kubenswrapper[4744]: I0930 02:58:39.971294 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" podUID="16046530-d8fe-40bb-9a22-2a021648faa9" containerName="oauth-openshift" containerID="cri-o://695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93" gracePeriod=15 Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.399648 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448110 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58444664d6-48fxg"] Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448412 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="extract-content" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448431 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="extract-content" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448446 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="extract-content" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448452 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="extract-content" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448464 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="extract-utilities" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448471 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="extract-utilities" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448479 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="registry-server" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448486 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="registry-server" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448495 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="extract-utilities" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448501 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="extract-utilities" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448510 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9d3f20-93f0-4844-b4c3-21380fca66b2" containerName="pruner" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448516 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9d3f20-93f0-4844-b4c3-21380fca66b2" containerName="pruner" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448526 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="extract-content" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448532 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="extract-content" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448540 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16046530-d8fe-40bb-9a22-2a021648faa9" containerName="oauth-openshift" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448546 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="16046530-d8fe-40bb-9a22-2a021648faa9" containerName="oauth-openshift" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448557 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52005152-7c9b-40dc-a7f9-f72bc8e1b0ac" containerName="pruner" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448565 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="52005152-7c9b-40dc-a7f9-f72bc8e1b0ac" containerName="pruner" Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448574 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f469d578-93cd-4537-bdc8-6c8908926457" 
containerName="extract-utilities"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448580 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="extract-utilities"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448589 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="extract-content"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448596 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="extract-content"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448605 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448612 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448622 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448629 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448639 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2676764-efb6-4e02-9012-74b8675e7bff" containerName="collect-profiles"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448693 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2676764-efb6-4e02-9012-74b8675e7bff" containerName="collect-profiles"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448704 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448711 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.448718 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="extract-utilities"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448725 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="extract-utilities"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448824 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="52005152-7c9b-40dc-a7f9-f72bc8e1b0ac" containerName="pruner"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448844 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a9d3f20-93f0-4844-b4c3-21380fca66b2" containerName="pruner"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448855 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f469d578-93cd-4537-bdc8-6c8908926457" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448866 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f186f25-71b8-4181-a625-f8e467dca6b8" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448877 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="16046530-d8fe-40bb-9a22-2a021648faa9" containerName="oauth-openshift"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448886 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4256d91-ff4e-42d6-a2d8-46aefd70ab57" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448895 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2676764-efb6-4e02-9012-74b8675e7bff" containerName="collect-profiles"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.448904 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5865885-5a62-4a4a-abed-d5a996d65890" containerName="registry-server"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.449349 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.465320 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58444664d6-48fxg"]
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.523516 4744 generic.go:334] "Generic (PLEG): container finished" podID="16046530-d8fe-40bb-9a22-2a021648faa9" containerID="695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93" exitCode=0
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.523592 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" event={"ID":"16046530-d8fe-40bb-9a22-2a021648faa9","Type":"ContainerDied","Data":"695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93"}
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.523664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq" event={"ID":"16046530-d8fe-40bb-9a22-2a021648faa9","Type":"ContainerDied","Data":"c2e0a9fdb0eb0d4cb73e2b4705ce2d5a53748626238ccb46f10aeef553ade428"}
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.523701 4744 scope.go:117] "RemoveContainer" containerID="695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.523616 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5rwhq"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.548615 4744 scope.go:117] "RemoveContainer" containerID="695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93"
Sep 30 02:58:40 crc kubenswrapper[4744]: E0930 02:58:40.549293 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93\": container with ID starting with 695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93 not found: ID does not exist" containerID="695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.549357 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93"} err="failed to get container status \"695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93\": rpc error: code = NotFound desc = could not find container \"695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93\": container with ID starting with 695bdc95065e926e11ac27bd8d3ef89aa7a202bd915e4c3217c7ef84b395ad93 not found: ID does not exist"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571413 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qkk9\" (UniqueName: \"kubernetes.io/projected/16046530-d8fe-40bb-9a22-2a021648faa9-kube-api-access-2qkk9\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571529 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-cliconfig\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571591 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16046530-d8fe-40bb-9a22-2a021648faa9-audit-dir\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571626 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-idp-0-file-data\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571658 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-provider-selection\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571696 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-service-ca\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-trusted-ca-bundle\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571770 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-serving-cert\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-error\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571867 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-ocp-branding-template\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571913 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-router-certs\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571952 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16046530-d8fe-40bb-9a22-2a021648faa9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.571964 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-login\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572011 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-audit-policies\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572355 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-session\") pod \"16046530-d8fe-40bb-9a22-2a021648faa9\" (UID: \"16046530-d8fe-40bb-9a22-2a021648faa9\") "
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572604 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572659 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572685 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-error\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572732 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572767 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572861 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-session\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572932 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572935 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.572969 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573001 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-audit-dir\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5kr\" (UniqueName: \"kubernetes.io/projected/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-kube-api-access-dp5kr\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573094 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-login\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573128 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-audit-policies\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573187 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573207 4744 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16046530-d8fe-40bb-9a22-2a021648faa9-audit-dir\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573447 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.573437 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.574193 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.579759 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.580343 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.580550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.580552 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16046530-d8fe-40bb-9a22-2a021648faa9-kube-api-access-2qkk9" (OuterVolumeSpecName: "kube-api-access-2qkk9") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "kube-api-access-2qkk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.581175 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.581180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.581449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.582579 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.583744 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "16046530-d8fe-40bb-9a22-2a021648faa9" (UID: "16046530-d8fe-40bb-9a22-2a021648faa9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674353 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674524 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-session\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674566 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674628 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674661 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674679 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-audit-dir\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5kr\" (UniqueName: \"kubernetes.io/projected/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-kube-api-access-dp5kr\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674746 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-login\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674805 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-audit-policies\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674863 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.674989 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-error\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675075 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675088 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675101 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675111 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675122 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675131 4744 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675143 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675155 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qkk9\" (UniqueName: \"kubernetes.io/projected/16046530-d8fe-40bb-9a22-2a021648faa9-kube-api-access-2qkk9\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675167 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675179 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675190 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.675202 4744 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16046530-d8fe-40bb-9a22-2a021648faa9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.676483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.676576 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-audit-dir\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.677474 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.678471 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.679471 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-audit-policies\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.680258 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.681004 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.681133 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-login\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.683428 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.683702 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-session\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.684931 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-user-template-error\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg"
Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.686882 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") "
pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.687763 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.695356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5kr\" (UniqueName: \"kubernetes.io/projected/73ee3975-b284-43db-b7f9-9c6d22b2a3dc-kube-api-access-dp5kr\") pod \"oauth-openshift-58444664d6-48fxg\" (UID: \"73ee3975-b284-43db-b7f9-9c6d22b2a3dc\") " pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.783508 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.872301 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5rwhq"] Sep 30 02:58:40 crc kubenswrapper[4744]: I0930 02:58:40.878418 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5rwhq"] Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 02:58:41.034774 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58444664d6-48fxg"] Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 02:58:41.511487 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16046530-d8fe-40bb-9a22-2a021648faa9" path="/var/lib/kubelet/pods/16046530-d8fe-40bb-9a22-2a021648faa9/volumes" Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 02:58:41.532101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" event={"ID":"73ee3975-b284-43db-b7f9-9c6d22b2a3dc","Type":"ContainerStarted","Data":"d34704ffc4ac3783331247fe6a9ae75375d81f3b906605daa89451d46ef9186c"} Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 02:58:41.532271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" event={"ID":"73ee3975-b284-43db-b7f9-9c6d22b2a3dc","Type":"ContainerStarted","Data":"6551a2b47e466669a3fd45dfaabfa49bfeacb723c991f60f7ad00d518312156a"} Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 02:58:41.532630 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 02:58:41.869794 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" Sep 30 02:58:41 crc kubenswrapper[4744]: I0930 
02:58:41.892947 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58444664d6-48fxg" podStartSLOduration=27.892919989 podStartE2EDuration="27.892919989s" podCreationTimestamp="2025-09-30 02:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:58:41.555677364 +0000 UTC m=+248.728897418" watchObservedRunningTime="2025-09-30 02:58:41.892919989 +0000 UTC m=+249.066139963" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.615231 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.616121 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.617490 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.618468 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.627159 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.635401 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.716542 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.717186 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.719083 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.730153 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 
02:59:02.736167 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.754674 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.757870 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:59:02 crc kubenswrapper[4744]: I0930 02:59:02.768421 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.056199 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 02:59:03 crc kubenswrapper[4744]: W0930 02:59:03.334614 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-b61c16d552dc5f22ebf1c8c2f5562e1b696299301815656b6eab237733c94170 WatchSource:0}: Error finding container b61c16d552dc5f22ebf1c8c2f5562e1b696299301815656b6eab237733c94170: Status 404 returned error can't find the container with id b61c16d552dc5f22ebf1c8c2f5562e1b696299301815656b6eab237733c94170 Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.695738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bfb677fb9498cde12d73cb96a0c27c0d7299da274c4b1b84f465fad0dc516244"} Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.695808 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b61c16d552dc5f22ebf1c8c2f5562e1b696299301815656b6eab237733c94170"} Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.707279 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"22c6c2884e10bab4e1cf168f6a6235170019cb18220f48001a3fd1ff4c3289d7"} Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.707568 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"08d686c20d1b9b33cb2330fd52bcaa61417d59bbe6444c20c9320012230694ba"} Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.718944 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"534773d7cfe00d043d638272ddb23ebf256b07fccadb4e5cbb32ae19b1bc9273"} Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.719020 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"98b1abcb3c20778321d75db09d61089f49aed7f8575d584a9c7b0a3b812d5c6d"} Sep 30 02:59:03 crc kubenswrapper[4744]: I0930 02:59:03.721893 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.364638 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj824"] Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.365918 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dj824" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="registry-server" containerID="cri-o://8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd" gracePeriod=30 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.373910 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bm469"] Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.374268 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bm469" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" 
containerName="registry-server" containerID="cri-o://93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1" gracePeriod=30 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.387458 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5rlq"] Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.387729 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" containerName="marketplace-operator" containerID="cri-o://8999f8e3488365ece7a71dbcc74dff5974c59490364a69cbbbd0e9f7d1ae0b50" gracePeriod=30 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.401753 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcwvf"] Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.402139 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gcwvf" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="registry-server" containerID="cri-o://eb23f407fa1db4ebe40c4a67e8c6b3a8669a478321ccd86c0598c7ce01626816" gracePeriod=30 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.404326 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2p9xn"] Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.404703 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2p9xn" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="registry-server" containerID="cri-o://c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8" gracePeriod=30 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.413620 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wkhkg"] Sep 30 02:59:16 crc 
kubenswrapper[4744]: I0930 02:59:16.414642 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.425524 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wkhkg"] Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.515559 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l969\" (UniqueName: \"kubernetes.io/projected/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-kube-api-access-9l969\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.515652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.515799 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.616740 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.616792 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.616874 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l969\" (UniqueName: \"kubernetes.io/projected/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-kube-api-access-9l969\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.618881 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.629298 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 
crc kubenswrapper[4744]: I0930 02:59:16.638972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l969\" (UniqueName: \"kubernetes.io/projected/fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280-kube-api-access-9l969\") pod \"marketplace-operator-79b997595-wkhkg\" (UID: \"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280\") " pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.822177 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.869566 4744 generic.go:334] "Generic (PLEG): container finished" podID="307311a7-837e-48b7-b54b-1830dab633a8" containerID="eb23f407fa1db4ebe40c4a67e8c6b3a8669a478321ccd86c0598c7ce01626816" exitCode=0 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.869637 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerDied","Data":"eb23f407fa1db4ebe40c4a67e8c6b3a8669a478321ccd86c0598c7ce01626816"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.873416 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.879107 4744 generic.go:334] "Generic (PLEG): container finished" podID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerID="93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1" exitCode=0 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.879271 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm469" event={"ID":"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4","Type":"ContainerDied","Data":"93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.879329 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm469" event={"ID":"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4","Type":"ContainerDied","Data":"ce0da2eb92a2cb9ab055805110e9e3bc63426cc0f7f78d1015ac7534c38da3be"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.879351 4744 scope.go:117] "RemoveContainer" containerID="93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.887651 4744 generic.go:334] "Generic (PLEG): container finished" podID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" containerID="8999f8e3488365ece7a71dbcc74dff5974c59490364a69cbbbd0e9f7d1ae0b50" exitCode=0 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.887687 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" event={"ID":"fd4c491e-16e3-4e31-a4a9-314d53ceada8","Type":"ContainerDied","Data":"8999f8e3488365ece7a71dbcc74dff5974c59490364a69cbbbd0e9f7d1ae0b50"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.888853 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.891771 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.891812 4744 generic.go:334] "Generic (PLEG): container finished" podID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerID="8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd" exitCode=0 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.891841 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj824" event={"ID":"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91","Type":"ContainerDied","Data":"8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.891865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj824" event={"ID":"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91","Type":"ContainerDied","Data":"2847ac49892ce56f5723bc19b2c11b5e35213ef341380a1a01ccf527825e0884"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.894916 4744 generic.go:334] "Generic (PLEG): container finished" podID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerID="c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8" exitCode=0 Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.894941 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p9xn" event={"ID":"0ab5a9f2-aa2f-462c-8f45-38a54be2359d","Type":"ContainerDied","Data":"c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8"} Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.927077 4744 scope.go:117] "RemoveContainer" containerID="8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3" Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 
02:59:16.933046 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8 is running failed: container process not found" containerID="c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 02:59:16.936833 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8 is running failed: container process not found" containerID="c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 02:59:16.937292 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8 is running failed: container process not found" containerID="c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 02:59:16.937400 4744 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-2p9xn" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="registry-server" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.940880 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.946879 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.953106 4744 scope.go:117] "RemoveContainer" containerID="6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.980073 4744 scope.go:117] "RemoveContainer" containerID="93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1" Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 02:59:16.980661 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1\": container with ID starting with 93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1 not found: ID does not exist" containerID="93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.980692 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1"} err="failed to get container status \"93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1\": rpc error: code = NotFound desc = could not find container \"93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1\": container with ID starting with 93debdcbe944cd1900beb5ed7e984bf96d5a0f4dd129ff50a61092de1de722c1 not found: ID does not exist" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.980713 4744 scope.go:117] "RemoveContainer" containerID="8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3" Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 02:59:16.980991 4744 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3\": container with ID starting with 8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3 not found: ID does not exist" containerID="8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.981013 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3"} err="failed to get container status \"8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3\": rpc error: code = NotFound desc = could not find container \"8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3\": container with ID starting with 8430220a83e33d38e67201409c53d301e45cd58585db7bde9a73f2576bd855c3 not found: ID does not exist" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.981026 4744 scope.go:117] "RemoveContainer" containerID="6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d" Sep 30 02:59:16 crc kubenswrapper[4744]: E0930 02:59:16.981192 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d\": container with ID starting with 6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d not found: ID does not exist" containerID="6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.981215 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d"} err="failed to get container status \"6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d\": rpc error: code = NotFound desc = could not find container 
\"6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d\": container with ID starting with 6e362f439ffa8d739f730241d51c92ce66c66c3ed37701822c519d9caefb323d not found: ID does not exist" Sep 30 02:59:16 crc kubenswrapper[4744]: I0930 02:59:16.981229 4744 scope.go:117] "RemoveContainer" containerID="8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.015252 4744 scope.go:117] "RemoveContainer" containerID="e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021318 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-catalog-content\") pod \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021360 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwbjj\" (UniqueName: \"kubernetes.io/projected/307311a7-837e-48b7-b54b-1830dab633a8-kube-api-access-jwbjj\") pod \"307311a7-837e-48b7-b54b-1830dab633a8\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021427 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-operator-metrics\") pod \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021456 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5b6\" (UniqueName: \"kubernetes.io/projected/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-kube-api-access-8w5b6\") pod \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\" (UID: 
\"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021478 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-catalog-content\") pod \"307311a7-837e-48b7-b54b-1830dab633a8\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021498 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-trusted-ca\") pod \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021514 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llgd9\" (UniqueName: \"kubernetes.io/projected/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-kube-api-access-llgd9\") pod \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021540 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-utilities\") pod \"307311a7-837e-48b7-b54b-1830dab633a8\" (UID: \"307311a7-837e-48b7-b54b-1830dab633a8\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021563 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-utilities\") pod \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\" (UID: \"0ab5a9f2-aa2f-462c-8f45-38a54be2359d\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021581 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-utilities\") pod \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021598 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4tvc\" (UniqueName: \"kubernetes.io/projected/fd4c491e-16e3-4e31-a4a9-314d53ceada8-kube-api-access-c4tvc\") pod \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\" (UID: \"fd4c491e-16e3-4e31-a4a9-314d53ceada8\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021615 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-utilities\") pod \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021645 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvzfq\" (UniqueName: \"kubernetes.io/projected/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-kube-api-access-hvzfq\") pod \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021671 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-catalog-content\") pod \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\" (UID: \"0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91\") " Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.021689 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-catalog-content\") pod \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\" (UID: \"13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4\") " Sep 30 02:59:17 crc 
kubenswrapper[4744]: I0930 02:59:17.023192 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "fd4c491e-16e3-4e31-a4a9-314d53ceada8" (UID: "fd4c491e-16e3-4e31-a4a9-314d53ceada8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.023796 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-utilities" (OuterVolumeSpecName: "utilities") pod "13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" (UID: "13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.024454 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-utilities" (OuterVolumeSpecName: "utilities") pod "0ab5a9f2-aa2f-462c-8f45-38a54be2359d" (UID: "0ab5a9f2-aa2f-462c-8f45-38a54be2359d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.025224 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-utilities" (OuterVolumeSpecName: "utilities") pod "0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" (UID: "0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.027892 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-utilities" (OuterVolumeSpecName: "utilities") pod "307311a7-837e-48b7-b54b-1830dab633a8" (UID: "307311a7-837e-48b7-b54b-1830dab633a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.032049 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4c491e-16e3-4e31-a4a9-314d53ceada8-kube-api-access-c4tvc" (OuterVolumeSpecName: "kube-api-access-c4tvc") pod "fd4c491e-16e3-4e31-a4a9-314d53ceada8" (UID: "fd4c491e-16e3-4e31-a4a9-314d53ceada8"). InnerVolumeSpecName "kube-api-access-c4tvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.033528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-kube-api-access-hvzfq" (OuterVolumeSpecName: "kube-api-access-hvzfq") pod "0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" (UID: "0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91"). InnerVolumeSpecName "kube-api-access-hvzfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.037199 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "307311a7-837e-48b7-b54b-1830dab633a8" (UID: "307311a7-837e-48b7-b54b-1830dab633a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.038964 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-kube-api-access-8w5b6" (OuterVolumeSpecName: "kube-api-access-8w5b6") pod "0ab5a9f2-aa2f-462c-8f45-38a54be2359d" (UID: "0ab5a9f2-aa2f-462c-8f45-38a54be2359d"). InnerVolumeSpecName "kube-api-access-8w5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.039107 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307311a7-837e-48b7-b54b-1830dab633a8-kube-api-access-jwbjj" (OuterVolumeSpecName: "kube-api-access-jwbjj") pod "307311a7-837e-48b7-b54b-1830dab633a8" (UID: "307311a7-837e-48b7-b54b-1830dab633a8"). InnerVolumeSpecName "kube-api-access-jwbjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.041596 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-kube-api-access-llgd9" (OuterVolumeSpecName: "kube-api-access-llgd9") pod "13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" (UID: "13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4"). InnerVolumeSpecName "kube-api-access-llgd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.042281 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "fd4c491e-16e3-4e31-a4a9-314d53ceada8" (UID: "fd4c491e-16e3-4e31-a4a9-314d53ceada8"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.049970 4744 scope.go:117] "RemoveContainer" containerID="e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.070811 4744 scope.go:117] "RemoveContainer" containerID="8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd" Sep 30 02:59:17 crc kubenswrapper[4744]: E0930 02:59:17.082150 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd\": container with ID starting with 8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd not found: ID does not exist" containerID="8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.082202 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd"} err="failed to get container status \"8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd\": rpc error: code = NotFound desc = could not find container \"8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd\": container with ID starting with 8c27fbebdc87ba564eb7b54c85c004c28ea0e3bd05fdef263e02c9dd3db36bcd not found: ID does not exist" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.082240 4744 scope.go:117] "RemoveContainer" containerID="e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0" Sep 30 02:59:17 crc kubenswrapper[4744]: E0930 02:59:17.084889 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0\": container with ID starting with 
e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0 not found: ID does not exist" containerID="e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.084993 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0"} err="failed to get container status \"e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0\": rpc error: code = NotFound desc = could not find container \"e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0\": container with ID starting with e89f3740212568d12bca052b7733c3b49dd200455aab8d6de71fb11352feedc0 not found: ID does not exist" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.085048 4744 scope.go:117] "RemoveContainer" containerID="e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea" Sep 30 02:59:17 crc kubenswrapper[4744]: E0930 02:59:17.085699 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea\": container with ID starting with e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea not found: ID does not exist" containerID="e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.085740 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea"} err="failed to get container status \"e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea\": rpc error: code = NotFound desc = could not find container \"e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea\": container with ID starting with e4864d162de6f02de5e411aa40acdcbacfb7e611ea6a298e87a6bb8b6d68f5ea not found: ID does not 
exist" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.102119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" (UID: "13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.104329 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" (UID: "0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123429 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123474 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5b6\" (UniqueName: \"kubernetes.io/projected/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-kube-api-access-8w5b6\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123487 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123495 4744 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd4c491e-16e3-4e31-a4a9-314d53ceada8-marketplace-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123503 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llgd9\" (UniqueName: \"kubernetes.io/projected/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-kube-api-access-llgd9\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123512 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307311a7-837e-48b7-b54b-1830dab633a8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123524 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123532 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123542 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4tvc\" (UniqueName: \"kubernetes.io/projected/fd4c491e-16e3-4e31-a4a9-314d53ceada8-kube-api-access-c4tvc\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123552 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123561 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvzfq\" (UniqueName: \"kubernetes.io/projected/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-kube-api-access-hvzfq\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123571 4744 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123580 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.123588 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwbjj\" (UniqueName: \"kubernetes.io/projected/307311a7-837e-48b7-b54b-1830dab633a8-kube-api-access-jwbjj\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.159752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ab5a9f2-aa2f-462c-8f45-38a54be2359d" (UID: "0ab5a9f2-aa2f-462c-8f45-38a54be2359d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.224477 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab5a9f2-aa2f-462c-8f45-38a54be2359d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.312304 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wkhkg"] Sep 30 02:59:17 crc kubenswrapper[4744]: W0930 02:59:17.319827 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe8983ae_8985_4ff0_8fbe_8ab1b8bb4280.slice/crio-2cade2479875629cf8e91906aba704b7d33bf5024ee9b1117b806294955ce500 WatchSource:0}: Error finding container 2cade2479875629cf8e91906aba704b7d33bf5024ee9b1117b806294955ce500: Status 404 returned error can't find the container with id 2cade2479875629cf8e91906aba704b7d33bf5024ee9b1117b806294955ce500 Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.901116 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj824" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.903892 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p9xn" event={"ID":"0ab5a9f2-aa2f-462c-8f45-38a54be2359d","Type":"ContainerDied","Data":"e8ebba163667e440358384e1f22c218d04492be25eb79c302d62f04a071b7532"} Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.903934 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2p9xn" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.903973 4744 scope.go:117] "RemoveContainer" containerID="c1b1fb83d7e3ad5bb7bd39b424012b98d1708f974c7378384701180ca22006c8" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.907050 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gcwvf" event={"ID":"307311a7-837e-48b7-b54b-1830dab633a8","Type":"ContainerDied","Data":"7b2df40ff685388e83cf961e432ee21e70489b7b9e5d76c266f1d16d61d76bac"} Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.907136 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gcwvf" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.908578 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" event={"ID":"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280","Type":"ContainerStarted","Data":"58562a2ff688a931e20903b64f8c765effbcdcb7837be60dd2c95bdfa1af935b"} Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.908602 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" event={"ID":"fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280","Type":"ContainerStarted","Data":"2cade2479875629cf8e91906aba704b7d33bf5024ee9b1117b806294955ce500"} Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.909070 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.909793 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bm469" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.912003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" event={"ID":"fd4c491e-16e3-4e31-a4a9-314d53ceada8","Type":"ContainerDied","Data":"823641158deffaee96c404bc5b914f8160f4fdb36953bbbd4b7f0c74d293813c"} Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.912046 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h5rlq" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.916523 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.922937 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj824"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.925424 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dj824"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.925599 4744 scope.go:117] "RemoveContainer" containerID="38a58a100913239572592128b3dfd8393eb66b0834cf183d4d7df89edc7cab73" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.936025 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2p9xn"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.943349 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2p9xn"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.953856 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h5rlq"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.962062 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-h5rlq"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.963175 4744 scope.go:117] "RemoveContainer" containerID="bc46d611b0f3548e2939a11e1e1d177638efaab7038f4da4909ab02f9ed62adc" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.984139 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bm469"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.986148 4744 scope.go:117] "RemoveContainer" containerID="eb23f407fa1db4ebe40c4a67e8c6b3a8669a478321ccd86c0598c7ce01626816" Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.987431 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bm469"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.993295 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcwvf"] Sep 30 02:59:17 crc kubenswrapper[4744]: I0930 02:59:17.996575 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gcwvf"] Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.001664 4744 scope.go:117] "RemoveContainer" containerID="5362330faf9e9c2694bea6e74c4f34914ebd3fa06b55a85573b33a1e28caa120" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.010678 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wkhkg" podStartSLOduration=2.010660357 podStartE2EDuration="2.010660357s" podCreationTimestamp="2025-09-30 02:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 02:59:18.010099331 +0000 UTC m=+285.183319305" watchObservedRunningTime="2025-09-30 02:59:18.010660357 +0000 UTC m=+285.183880331" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.022330 4744 scope.go:117] "RemoveContainer" 
containerID="86bd18c02b2adf2ff3d8e7c23ae8ab9f4d799a6758a6a55df0ebd2317a292a0e" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.054925 4744 scope.go:117] "RemoveContainer" containerID="8999f8e3488365ece7a71dbcc74dff5974c59490364a69cbbbd0e9f7d1ae0b50" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.586646 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ts2v8"] Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587317 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587336 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587356 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587388 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587406 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587420 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587438 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" containerName="marketplace-operator" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587450 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" 
containerName="marketplace-operator" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587467 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587479 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587495 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587506 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587524 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587536 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587555 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587567 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587584 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587599 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" 
containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587620 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587632 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587651 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587663 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587678 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587690 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="extract-utilities" Sep 30 02:59:18 crc kubenswrapper[4744]: E0930 02:59:18.587711 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587723 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="extract-content" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587885 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587911 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" 
containerName="marketplace-operator" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587930 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587945 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.587962 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="307311a7-837e-48b7-b54b-1830dab633a8" containerName="registry-server" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.589152 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.593448 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.600540 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts2v8"] Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.754715 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23313229-2e34-4cf4-988e-e273962bec95-catalog-content\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.755070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23313229-2e34-4cf4-988e-e273962bec95-utilities\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " 
pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.755218 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z29ql\" (UniqueName: \"kubernetes.io/projected/23313229-2e34-4cf4-988e-e273962bec95-kube-api-access-z29ql\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.788636 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zpn7m"] Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.792151 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.795993 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.808822 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpn7m"] Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.857164 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23313229-2e34-4cf4-988e-e273962bec95-utilities\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.857684 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z29ql\" (UniqueName: \"kubernetes.io/projected/23313229-2e34-4cf4-988e-e273962bec95-kube-api-access-z29ql\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 
02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.857805 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23313229-2e34-4cf4-988e-e273962bec95-catalog-content\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.858057 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23313229-2e34-4cf4-988e-e273962bec95-utilities\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.858562 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23313229-2e34-4cf4-988e-e273962bec95-catalog-content\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.882078 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z29ql\" (UniqueName: \"kubernetes.io/projected/23313229-2e34-4cf4-988e-e273962bec95-kube-api-access-z29ql\") pod \"redhat-marketplace-ts2v8\" (UID: \"23313229-2e34-4cf4-988e-e273962bec95\") " pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.918742 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.959590 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whp46\" (UniqueName: \"kubernetes.io/projected/f06dd885-034b-4e39-bb3b-689087c8a26c-kube-api-access-whp46\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.959763 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f06dd885-034b-4e39-bb3b-689087c8a26c-catalog-content\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:18 crc kubenswrapper[4744]: I0930 02:59:18.959919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f06dd885-034b-4e39-bb3b-689087c8a26c-utilities\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.061462 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f06dd885-034b-4e39-bb3b-689087c8a26c-utilities\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.061585 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whp46\" (UniqueName: \"kubernetes.io/projected/f06dd885-034b-4e39-bb3b-689087c8a26c-kube-api-access-whp46\") pod \"redhat-operators-zpn7m\" (UID: 
\"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.061614 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f06dd885-034b-4e39-bb3b-689087c8a26c-catalog-content\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.063056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f06dd885-034b-4e39-bb3b-689087c8a26c-catalog-content\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.063171 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f06dd885-034b-4e39-bb3b-689087c8a26c-utilities\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.082356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whp46\" (UniqueName: \"kubernetes.io/projected/f06dd885-034b-4e39-bb3b-689087c8a26c-kube-api-access-whp46\") pod \"redhat-operators-zpn7m\" (UID: \"f06dd885-034b-4e39-bb3b-689087c8a26c\") " pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.110968 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts2v8"] Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.114608 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:19 crc kubenswrapper[4744]: W0930 02:59:19.125978 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23313229_2e34_4cf4_988e_e273962bec95.slice/crio-11fd461b2ee557937cbada4f3b2ab9245e0b574d59641f38dc61a0c6d68b65b9 WatchSource:0}: Error finding container 11fd461b2ee557937cbada4f3b2ab9245e0b574d59641f38dc61a0c6d68b65b9: Status 404 returned error can't find the container with id 11fd461b2ee557937cbada4f3b2ab9245e0b574d59641f38dc61a0c6d68b65b9 Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.321790 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpn7m"] Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.513388 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab5a9f2-aa2f-462c-8f45-38a54be2359d" path="/var/lib/kubelet/pods/0ab5a9f2-aa2f-462c-8f45-38a54be2359d/volumes" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.514397 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91" path="/var/lib/kubelet/pods/0b2592cf-cbdf-4a1d-9f0d-c4b6b5665e91/volumes" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.515155 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4" path="/var/lib/kubelet/pods/13a4ab3e-6ffc-487f-aaa7-fc255de0bbe4/volumes" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.516519 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307311a7-837e-48b7-b54b-1830dab633a8" path="/var/lib/kubelet/pods/307311a7-837e-48b7-b54b-1830dab633a8/volumes" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.517304 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4c491e-16e3-4e31-a4a9-314d53ceada8" 
path="/var/lib/kubelet/pods/fd4c491e-16e3-4e31-a4a9-314d53ceada8/volumes" Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.940353 4744 generic.go:334] "Generic (PLEG): container finished" podID="23313229-2e34-4cf4-988e-e273962bec95" containerID="3dee63d7d7f194b49b122dc0eba2443aeb148fb229cc8a26f4f8b82d575fb06f" exitCode=0 Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.940431 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts2v8" event={"ID":"23313229-2e34-4cf4-988e-e273962bec95","Type":"ContainerDied","Data":"3dee63d7d7f194b49b122dc0eba2443aeb148fb229cc8a26f4f8b82d575fb06f"} Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.940893 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts2v8" event={"ID":"23313229-2e34-4cf4-988e-e273962bec95","Type":"ContainerStarted","Data":"11fd461b2ee557937cbada4f3b2ab9245e0b574d59641f38dc61a0c6d68b65b9"} Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.943555 4744 generic.go:334] "Generic (PLEG): container finished" podID="f06dd885-034b-4e39-bb3b-689087c8a26c" containerID="260401ec23efc29e4b5596d910fd53cf533b8f681c5f43b61862882eff205e37" exitCode=0 Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.943600 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpn7m" event={"ID":"f06dd885-034b-4e39-bb3b-689087c8a26c","Type":"ContainerDied","Data":"260401ec23efc29e4b5596d910fd53cf533b8f681c5f43b61862882eff205e37"} Sep 30 02:59:19 crc kubenswrapper[4744]: I0930 02:59:19.943648 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpn7m" event={"ID":"f06dd885-034b-4e39-bb3b-689087c8a26c","Type":"ContainerStarted","Data":"96dfe93d5290f873e5411253d9f7ac45227a45b64af227ba0ae7e997552dd7c0"} Sep 30 02:59:20 crc kubenswrapper[4744]: I0930 02:59:20.983072 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-fxk7c"] Sep 30 02:59:20 crc kubenswrapper[4744]: I0930 02:59:20.985514 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:20 crc kubenswrapper[4744]: I0930 02:59:20.988663 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.004958 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxk7c"] Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.091179 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-utilities\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.091245 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4lg\" (UniqueName: \"kubernetes.io/projected/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-kube-api-access-mr4lg\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.091291 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-catalog-content\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.193013 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-ql7fl"] Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.193334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-utilities\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.193554 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4lg\" (UniqueName: \"kubernetes.io/projected/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-kube-api-access-mr4lg\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.193647 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-catalog-content\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.194464 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-catalog-content\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.195078 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-utilities\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " 
pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.196475 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.206315 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.216999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ql7fl"] Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.235049 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4lg\" (UniqueName: \"kubernetes.io/projected/0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6-kube-api-access-mr4lg\") pod \"certified-operators-fxk7c\" (UID: \"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6\") " pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.295950 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slsqb\" (UniqueName: \"kubernetes.io/projected/a70de5bd-856c-42de-a059-e533218cf02b-kube-api-access-slsqb\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.296048 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a70de5bd-856c-42de-a059-e533218cf02b-utilities\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.296080 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a70de5bd-856c-42de-a059-e533218cf02b-catalog-content\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.325955 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.397748 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slsqb\" (UniqueName: \"kubernetes.io/projected/a70de5bd-856c-42de-a059-e533218cf02b-kube-api-access-slsqb\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.397817 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a70de5bd-856c-42de-a059-e533218cf02b-utilities\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.397854 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a70de5bd-856c-42de-a059-e533218cf02b-catalog-content\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.398590 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a70de5bd-856c-42de-a059-e533218cf02b-catalog-content\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " 
pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.398630 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a70de5bd-856c-42de-a059-e533218cf02b-utilities\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.428358 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slsqb\" (UniqueName: \"kubernetes.io/projected/a70de5bd-856c-42de-a059-e533218cf02b-kube-api-access-slsqb\") pod \"community-operators-ql7fl\" (UID: \"a70de5bd-856c-42de-a059-e533218cf02b\") " pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.554573 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.595507 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxk7c"] Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.755980 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ql7fl"] Sep 30 02:59:21 crc kubenswrapper[4744]: W0930 02:59:21.908999 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70de5bd_856c_42de_a059_e533218cf02b.slice/crio-fe4777d341cc1a93835d49a9eabd55248437c5d3ec58a360e0e2e1ed5883b759 WatchSource:0}: Error finding container fe4777d341cc1a93835d49a9eabd55248437c5d3ec58a360e0e2e1ed5883b759: Status 404 returned error can't find the container with id fe4777d341cc1a93835d49a9eabd55248437c5d3ec58a360e0e2e1ed5883b759 Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.959636 4744 generic.go:334] 
"Generic (PLEG): container finished" podID="0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6" containerID="d105886aee68a8fb2b744d6a1348fbf4c123ac06fef4309cade2f7c3995d65ec" exitCode=0 Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.959726 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk7c" event={"ID":"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6","Type":"ContainerDied","Data":"d105886aee68a8fb2b744d6a1348fbf4c123ac06fef4309cade2f7c3995d65ec"} Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.960045 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk7c" event={"ID":"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6","Type":"ContainerStarted","Data":"128f6305bb60b79fa1b3ff547545cd7eaaeae79738fa020789d6d31566c2d3fd"} Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.962945 4744 generic.go:334] "Generic (PLEG): container finished" podID="23313229-2e34-4cf4-988e-e273962bec95" containerID="48241615c277b14d3bf4e83ae2dcd460c781b3c6870e2bea0bab110b4feac36c" exitCode=0 Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.963040 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts2v8" event={"ID":"23313229-2e34-4cf4-988e-e273962bec95","Type":"ContainerDied","Data":"48241615c277b14d3bf4e83ae2dcd460c781b3c6870e2bea0bab110b4feac36c"} Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.966077 4744 generic.go:334] "Generic (PLEG): container finished" podID="f06dd885-034b-4e39-bb3b-689087c8a26c" containerID="d5a8fa3d89f8bf7886de9753e0a41623b5d9fe8d3de89b5f34b2e28b40cef152" exitCode=0 Sep 30 02:59:21 crc kubenswrapper[4744]: I0930 02:59:21.966150 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpn7m" event={"ID":"f06dd885-034b-4e39-bb3b-689087c8a26c","Type":"ContainerDied","Data":"d5a8fa3d89f8bf7886de9753e0a41623b5d9fe8d3de89b5f34b2e28b40cef152"} Sep 30 02:59:21 crc 
kubenswrapper[4744]: I0930 02:59:21.971073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ql7fl" event={"ID":"a70de5bd-856c-42de-a059-e533218cf02b","Type":"ContainerStarted","Data":"fe4777d341cc1a93835d49a9eabd55248437c5d3ec58a360e0e2e1ed5883b759"} Sep 30 02:59:22 crc kubenswrapper[4744]: I0930 02:59:22.980178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts2v8" event={"ID":"23313229-2e34-4cf4-988e-e273962bec95","Type":"ContainerStarted","Data":"d42c6eff9639326ef5ce27b21b5cae1b409d8b933ede383eda220a7caf4f8dad"} Sep 30 02:59:22 crc kubenswrapper[4744]: I0930 02:59:22.982733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk7c" event={"ID":"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6","Type":"ContainerStarted","Data":"f23703831533d377f1d798a75b913b663b1e2278116dfc12ae4f3c515c948540"} Sep 30 02:59:22 crc kubenswrapper[4744]: I0930 02:59:22.985468 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpn7m" event={"ID":"f06dd885-034b-4e39-bb3b-689087c8a26c","Type":"ContainerStarted","Data":"e4935c05e96e8c36c7604aa0f280f793303a148a4fbdcb89324c4577f3bc2d8f"} Sep 30 02:59:22 crc kubenswrapper[4744]: I0930 02:59:22.987189 4744 generic.go:334] "Generic (PLEG): container finished" podID="a70de5bd-856c-42de-a059-e533218cf02b" containerID="43c9786028dff7ea97fdfc79162e5d219dd45876d51ad06f62ba565afae37cf4" exitCode=0 Sep 30 02:59:22 crc kubenswrapper[4744]: I0930 02:59:22.987214 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ql7fl" event={"ID":"a70de5bd-856c-42de-a059-e533218cf02b","Type":"ContainerDied","Data":"43c9786028dff7ea97fdfc79162e5d219dd45876d51ad06f62ba565afae37cf4"} Sep 30 02:59:22 crc kubenswrapper[4744]: I0930 02:59:22.998858 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-ts2v8" podStartSLOduration=2.509710729 podStartE2EDuration="4.998842884s" podCreationTimestamp="2025-09-30 02:59:18 +0000 UTC" firstStartedPulling="2025-09-30 02:59:19.94607364 +0000 UTC m=+287.119293624" lastFinishedPulling="2025-09-30 02:59:22.435205775 +0000 UTC m=+289.608425779" observedRunningTime="2025-09-30 02:59:22.996584394 +0000 UTC m=+290.169804368" watchObservedRunningTime="2025-09-30 02:59:22.998842884 +0000 UTC m=+290.172062858" Sep 30 02:59:23 crc kubenswrapper[4744]: I0930 02:59:23.049761 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zpn7m" podStartSLOduration=2.280707021 podStartE2EDuration="5.049741575s" podCreationTimestamp="2025-09-30 02:59:18 +0000 UTC" firstStartedPulling="2025-09-30 02:59:19.946083921 +0000 UTC m=+287.119303935" lastFinishedPulling="2025-09-30 02:59:22.715118505 +0000 UTC m=+289.888338489" observedRunningTime="2025-09-30 02:59:23.049303921 +0000 UTC m=+290.222523895" watchObservedRunningTime="2025-09-30 02:59:23.049741575 +0000 UTC m=+290.222961549" Sep 30 02:59:24 crc kubenswrapper[4744]: I0930 02:59:24.005887 4744 generic.go:334] "Generic (PLEG): container finished" podID="0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6" containerID="f23703831533d377f1d798a75b913b663b1e2278116dfc12ae4f3c515c948540" exitCode=0 Sep 30 02:59:24 crc kubenswrapper[4744]: I0930 02:59:24.007051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk7c" event={"ID":"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6","Type":"ContainerDied","Data":"f23703831533d377f1d798a75b913b663b1e2278116dfc12ae4f3c515c948540"} Sep 30 02:59:25 crc kubenswrapper[4744]: I0930 02:59:25.016063 4744 generic.go:334] "Generic (PLEG): container finished" podID="a70de5bd-856c-42de-a059-e533218cf02b" containerID="19d94920aae638fa6b556de43c672d727b79225a056386a556464fe7d136d335" exitCode=0 Sep 30 02:59:25 crc kubenswrapper[4744]: 
I0930 02:59:25.016103 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ql7fl" event={"ID":"a70de5bd-856c-42de-a059-e533218cf02b","Type":"ContainerDied","Data":"19d94920aae638fa6b556de43c672d727b79225a056386a556464fe7d136d335"} Sep 30 02:59:26 crc kubenswrapper[4744]: I0930 02:59:26.034195 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ql7fl" event={"ID":"a70de5bd-856c-42de-a059-e533218cf02b","Type":"ContainerStarted","Data":"14be1f898919e89b1b14b7bdf5cdc6026e4da09d7eccaca724d345ddb8c49157"} Sep 30 02:59:26 crc kubenswrapper[4744]: I0930 02:59:26.037959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxk7c" event={"ID":"0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6","Type":"ContainerStarted","Data":"34a0b29fcd80e53fdb7a985753e872e192da7e74a2b2b97af0f4f4e1fd269d82"} Sep 30 02:59:26 crc kubenswrapper[4744]: I0930 02:59:26.057449 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ql7fl" podStartSLOduration=2.556013203 podStartE2EDuration="5.057428007s" podCreationTimestamp="2025-09-30 02:59:21 +0000 UTC" firstStartedPulling="2025-09-30 02:59:22.988989289 +0000 UTC m=+290.162209263" lastFinishedPulling="2025-09-30 02:59:25.490404093 +0000 UTC m=+292.663624067" observedRunningTime="2025-09-30 02:59:26.052463934 +0000 UTC m=+293.225683908" watchObservedRunningTime="2025-09-30 02:59:26.057428007 +0000 UTC m=+293.230647981" Sep 30 02:59:26 crc kubenswrapper[4744]: I0930 02:59:26.073963 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fxk7c" podStartSLOduration=3.612015211 podStartE2EDuration="6.073940476s" podCreationTimestamp="2025-09-30 02:59:20 +0000 UTC" firstStartedPulling="2025-09-30 02:59:21.961276126 +0000 UTC m=+289.134496100" lastFinishedPulling="2025-09-30 02:59:24.423201391 
+0000 UTC m=+291.596421365" observedRunningTime="2025-09-30 02:59:26.070932694 +0000 UTC m=+293.244152668" watchObservedRunningTime="2025-09-30 02:59:26.073940476 +0000 UTC m=+293.247160450" Sep 30 02:59:28 crc kubenswrapper[4744]: I0930 02:59:28.920961 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:28 crc kubenswrapper[4744]: I0930 02:59:28.921845 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:28 crc kubenswrapper[4744]: I0930 02:59:28.998218 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:29 crc kubenswrapper[4744]: I0930 02:59:29.091485 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ts2v8" Sep 30 02:59:29 crc kubenswrapper[4744]: I0930 02:59:29.115391 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:29 crc kubenswrapper[4744]: I0930 02:59:29.115450 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:29 crc kubenswrapper[4744]: I0930 02:59:29.177959 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:30 crc kubenswrapper[4744]: I0930 02:59:30.111888 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zpn7m" Sep 30 02:59:31 crc kubenswrapper[4744]: I0930 02:59:31.327238 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:31 crc kubenswrapper[4744]: I0930 02:59:31.327736 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:31 crc kubenswrapper[4744]: I0930 02:59:31.376267 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:31 crc kubenswrapper[4744]: I0930 02:59:31.555843 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:31 crc kubenswrapper[4744]: I0930 02:59:31.555913 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:31 crc kubenswrapper[4744]: I0930 02:59:31.607186 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:32 crc kubenswrapper[4744]: I0930 02:59:32.135141 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fxk7c" Sep 30 02:59:32 crc kubenswrapper[4744]: I0930 02:59:32.138629 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ql7fl" Sep 30 02:59:42 crc kubenswrapper[4744]: I0930 02:59:42.772228 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.145243 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj"] Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.148493 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.150515 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.150864 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.158565 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj"] Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.260572 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-secret-volume\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.260745 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdsf\" (UniqueName: \"kubernetes.io/projected/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-kube-api-access-rjdsf\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.260796 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-config-volume\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.362356 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-secret-volume\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.362843 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdsf\" (UniqueName: \"kubernetes.io/projected/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-kube-api-access-rjdsf\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.363237 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-config-volume\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.364949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-config-volume\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.373528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-secret-volume\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.393232 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdsf\" (UniqueName: \"kubernetes.io/projected/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-kube-api-access-rjdsf\") pod \"collect-profiles-29320020-6zrcj\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.478950 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:00 crc kubenswrapper[4744]: I0930 03:00:00.768064 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj"] Sep 30 03:00:01 crc kubenswrapper[4744]: I0930 03:00:01.259293 4744 generic.go:334] "Generic (PLEG): container finished" podID="6745fc8a-f3db-42e5-b034-4b20a40fe2bf" containerID="f6880137063365cd30d482582bc5023817815a8d74510201f7907af9635624c4" exitCode=0 Sep 30 03:00:01 crc kubenswrapper[4744]: I0930 03:00:01.259416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" event={"ID":"6745fc8a-f3db-42e5-b034-4b20a40fe2bf","Type":"ContainerDied","Data":"f6880137063365cd30d482582bc5023817815a8d74510201f7907af9635624c4"} Sep 30 03:00:01 crc kubenswrapper[4744]: I0930 03:00:01.259576 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" 
event={"ID":"6745fc8a-f3db-42e5-b034-4b20a40fe2bf","Type":"ContainerStarted","Data":"94368abdc2afb0574d5b503c39d8908618cab99d3b562c9abc865c7f4dc3a1cd"} Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.569905 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.693035 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-secret-volume\") pod \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.693310 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjdsf\" (UniqueName: \"kubernetes.io/projected/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-kube-api-access-rjdsf\") pod \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.693361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-config-volume\") pod \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\" (UID: \"6745fc8a-f3db-42e5-b034-4b20a40fe2bf\") " Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.694195 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "6745fc8a-f3db-42e5-b034-4b20a40fe2bf" (UID: "6745fc8a-f3db-42e5-b034-4b20a40fe2bf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.701568 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6745fc8a-f3db-42e5-b034-4b20a40fe2bf" (UID: "6745fc8a-f3db-42e5-b034-4b20a40fe2bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.702566 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-kube-api-access-rjdsf" (OuterVolumeSpecName: "kube-api-access-rjdsf") pod "6745fc8a-f3db-42e5-b034-4b20a40fe2bf" (UID: "6745fc8a-f3db-42e5-b034-4b20a40fe2bf"). InnerVolumeSpecName "kube-api-access-rjdsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.794462 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.794495 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 03:00:02 crc kubenswrapper[4744]: I0930 03:00:02.794531 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjdsf\" (UniqueName: \"kubernetes.io/projected/6745fc8a-f3db-42e5-b034-4b20a40fe2bf-kube-api-access-rjdsf\") on node \"crc\" DevicePath \"\"" Sep 30 03:00:03 crc kubenswrapper[4744]: I0930 03:00:03.275597 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" 
event={"ID":"6745fc8a-f3db-42e5-b034-4b20a40fe2bf","Type":"ContainerDied","Data":"94368abdc2afb0574d5b503c39d8908618cab99d3b562c9abc865c7f4dc3a1cd"} Sep 30 03:00:03 crc kubenswrapper[4744]: I0930 03:00:03.275670 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94368abdc2afb0574d5b503c39d8908618cab99d3b562c9abc865c7f4dc3a1cd" Sep 30 03:00:03 crc kubenswrapper[4744]: I0930 03:00:03.275681 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj" Sep 30 03:00:34 crc kubenswrapper[4744]: I0930 03:00:34.347508 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:00:34 crc kubenswrapper[4744]: I0930 03:00:34.348093 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:01:04 crc kubenswrapper[4744]: I0930 03:01:04.347710 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:01:04 crc kubenswrapper[4744]: I0930 03:01:04.348771 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.348028 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.348984 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.349194 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.350355 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94351d00f554b003d6de416947e6ae7daf34d530ef3993fcc2dd9dc065b4279b"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.350471 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://94351d00f554b003d6de416947e6ae7daf34d530ef3993fcc2dd9dc065b4279b" gracePeriod=600 Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.911841 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="94351d00f554b003d6de416947e6ae7daf34d530ef3993fcc2dd9dc065b4279b" exitCode=0 Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.911981 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"94351d00f554b003d6de416947e6ae7daf34d530ef3993fcc2dd9dc065b4279b"} Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.912475 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"1b74a2cf6f2555d1fa0644ad61981edd38de911946c34219f0d3be9f495b0022"} Sep 30 03:01:34 crc kubenswrapper[4744]: I0930 03:01:34.912519 4744 scope.go:117] "RemoveContainer" containerID="c4d257819c9f2dc38d837c80419803eac7ac1c8283c11fbdbbdfd4c62ae3c173" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.045992 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c95xr"] Sep 30 03:03:02 crc kubenswrapper[4744]: E0930 03:03:02.047017 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6745fc8a-f3db-42e5-b034-4b20a40fe2bf" containerName="collect-profiles" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.047038 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6745fc8a-f3db-42e5-b034-4b20a40fe2bf" containerName="collect-profiles" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.047218 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6745fc8a-f3db-42e5-b034-4b20a40fe2bf" containerName="collect-profiles" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.047957 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.066105 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c95xr"] Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.203906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.203982 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtf4b\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-kube-api-access-gtf4b\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.204147 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-registry-tls\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.204309 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-registry-certificates\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.204437 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.204574 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-trusted-ca\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.204678 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.204803 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-bound-sa-token\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.252708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.306912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-registry-tls\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.307004 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-registry-certificates\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.307033 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.307066 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-trusted-ca\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.307090 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-bound-sa-token\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.307129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.307149 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtf4b\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-kube-api-access-gtf4b\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.308122 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.308195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-registry-certificates\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.309460 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-trusted-ca\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.314968 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.315102 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-registry-tls\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.324280 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtf4b\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-kube-api-access-gtf4b\") pod \"image-registry-66df7c8f76-c95xr\" (UID: \"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.327789 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42a2120a-f1dc-47dc-b6b9-424e3d3c32bf-bound-sa-token\") pod \"image-registry-66df7c8f76-c95xr\" (UID: 
\"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf\") " pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.372183 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:02 crc kubenswrapper[4744]: I0930 03:03:02.682759 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c95xr"] Sep 30 03:03:03 crc kubenswrapper[4744]: I0930 03:03:03.555195 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" event={"ID":"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf","Type":"ContainerStarted","Data":"f06382f35e8d8ca6b3fcd89b2c5b7c6aa8c1c652a77da175eb1530a31096e1cd"} Sep 30 03:03:03 crc kubenswrapper[4744]: I0930 03:03:03.555650 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" event={"ID":"42a2120a-f1dc-47dc-b6b9-424e3d3c32bf","Type":"ContainerStarted","Data":"ee4aa70bcbce1cc15dd422abb89188d37a1fba80a096e5521837d2e1cedb1edb"} Sep 30 03:03:03 crc kubenswrapper[4744]: I0930 03:03:03.555678 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:03 crc kubenswrapper[4744]: I0930 03:03:03.585470 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" podStartSLOduration=1.585443084 podStartE2EDuration="1.585443084s" podCreationTimestamp="2025-09-30 03:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:03:03.583072739 +0000 UTC m=+510.756292743" watchObservedRunningTime="2025-09-30 03:03:03.585443084 +0000 UTC m=+510.758663088" Sep 30 03:03:22 crc kubenswrapper[4744]: I0930 
03:03:22.380892 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-c95xr" Sep 30 03:03:22 crc kubenswrapper[4744]: I0930 03:03:22.457467 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q47dv"] Sep 30 03:03:34 crc kubenswrapper[4744]: I0930 03:03:34.347582 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:03:34 crc kubenswrapper[4744]: I0930 03:03:34.348785 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:03:47 crc kubenswrapper[4744]: I0930 03:03:47.502325 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" podUID="ac5f1e36-fb65-446e-92df-6d6bb5cca50d" containerName="registry" containerID="cri-o://5e19c127461a20409b4977abee7556895ac4e46683fc6bf51d89458299e66915" gracePeriod=30 Sep 30 03:03:47 crc kubenswrapper[4744]: I0930 03:03:47.863402 4744 generic.go:334] "Generic (PLEG): container finished" podID="ac5f1e36-fb65-446e-92df-6d6bb5cca50d" containerID="5e19c127461a20409b4977abee7556895ac4e46683fc6bf51d89458299e66915" exitCode=0 Sep 30 03:03:47 crc kubenswrapper[4744]: I0930 03:03:47.863448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" 
event={"ID":"ac5f1e36-fb65-446e-92df-6d6bb5cca50d","Type":"ContainerDied","Data":"5e19c127461a20409b4977abee7556895ac4e46683fc6bf51d89458299e66915"} Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.041002 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.145959 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blmrn\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-kube-api-access-blmrn\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146022 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-tls\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146061 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-ca-trust-extracted\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146089 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-installation-pull-secrets\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146239 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146272 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-trusted-ca\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146309 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-certificates\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.146410 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-bound-sa-token\") pod \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\" (UID: \"ac5f1e36-fb65-446e-92df-6d6bb5cca50d\") " Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.147551 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.147627 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.160117 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.160214 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-kube-api-access-blmrn" (OuterVolumeSpecName: "kube-api-access-blmrn") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "kube-api-access-blmrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.160698 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.161535 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.173481 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.183670 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ac5f1e36-fb65-446e-92df-6d6bb5cca50d" (UID: "ac5f1e36-fb65-446e-92df-6d6bb5cca50d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248103 4744 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248186 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248211 4744 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248229 4744 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248246 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blmrn\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-kube-api-access-blmrn\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248263 4744 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.248282 4744 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ac5f1e36-fb65-446e-92df-6d6bb5cca50d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 03:03:48 crc 
kubenswrapper[4744]: I0930 03:03:48.872008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" event={"ID":"ac5f1e36-fb65-446e-92df-6d6bb5cca50d","Type":"ContainerDied","Data":"e7bd018b3d08396b83b9b645a2cf67a871c73b867953081a4a7d7d6c85be3ece"} Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.872080 4744 scope.go:117] "RemoveContainer" containerID="5e19c127461a20409b4977abee7556895ac4e46683fc6bf51d89458299e66915" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.872196 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q47dv" Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.902989 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q47dv"] Sep 30 03:03:48 crc kubenswrapper[4744]: I0930 03:03:48.907567 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q47dv"] Sep 30 03:03:49 crc kubenswrapper[4744]: I0930 03:03:49.514833 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5f1e36-fb65-446e-92df-6d6bb5cca50d" path="/var/lib/kubelet/pods/ac5f1e36-fb65-446e-92df-6d6bb5cca50d/volumes" Sep 30 03:04:04 crc kubenswrapper[4744]: I0930 03:04:04.347824 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:04:04 crc kubenswrapper[4744]: I0930 03:04:04.348322 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.632079 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vhcl2"] Sep 30 03:04:28 crc kubenswrapper[4744]: E0930 03:04:28.632981 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5f1e36-fb65-446e-92df-6d6bb5cca50d" containerName="registry" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.633003 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5f1e36-fb65-446e-92df-6d6bb5cca50d" containerName="registry" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.633178 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5f1e36-fb65-446e-92df-6d6bb5cca50d" containerName="registry" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.633717 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.637009 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.638130 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-68p8s" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.638328 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.648135 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tcddl"] Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.649165 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-tcddl" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.652201 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-mz4qc" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.654210 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vhcl2"] Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.662288 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-fc4dx"] Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.663695 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.667021 4744 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cfgm8" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.672816 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tcddl"] Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.682865 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-fc4dx"] Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.768040 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cws\" (UniqueName: \"kubernetes.io/projected/c692f12b-868a-4985-8c61-529463a4bbf5-kube-api-access-84cws\") pod \"cert-manager-5b446d88c5-tcddl\" (UID: \"c692f12b-868a-4985-8c61-529463a4bbf5\") " pod="cert-manager/cert-manager-5b446d88c5-tcddl" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.768093 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9z58\" (UniqueName: 
\"kubernetes.io/projected/fa738a2a-d979-4352-82d3-ed7eb89e8fd9-kube-api-access-s9z58\") pod \"cert-manager-cainjector-7f985d654d-vhcl2\" (UID: \"fa738a2a-d979-4352-82d3-ed7eb89e8fd9\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.768126 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzf4h\" (UniqueName: \"kubernetes.io/projected/05f345c2-2e42-4cf0-85f6-6a40551d51d7-kube-api-access-jzf4h\") pod \"cert-manager-webhook-5655c58dd6-fc4dx\" (UID: \"05f345c2-2e42-4cf0-85f6-6a40551d51d7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.869238 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84cws\" (UniqueName: \"kubernetes.io/projected/c692f12b-868a-4985-8c61-529463a4bbf5-kube-api-access-84cws\") pod \"cert-manager-5b446d88c5-tcddl\" (UID: \"c692f12b-868a-4985-8c61-529463a4bbf5\") " pod="cert-manager/cert-manager-5b446d88c5-tcddl" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.869283 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9z58\" (UniqueName: \"kubernetes.io/projected/fa738a2a-d979-4352-82d3-ed7eb89e8fd9-kube-api-access-s9z58\") pod \"cert-manager-cainjector-7f985d654d-vhcl2\" (UID: \"fa738a2a-d979-4352-82d3-ed7eb89e8fd9\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.869321 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzf4h\" (UniqueName: \"kubernetes.io/projected/05f345c2-2e42-4cf0-85f6-6a40551d51d7-kube-api-access-jzf4h\") pod \"cert-manager-webhook-5655c58dd6-fc4dx\" (UID: \"05f345c2-2e42-4cf0-85f6-6a40551d51d7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 
03:04:28.886666 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9z58\" (UniqueName: \"kubernetes.io/projected/fa738a2a-d979-4352-82d3-ed7eb89e8fd9-kube-api-access-s9z58\") pod \"cert-manager-cainjector-7f985d654d-vhcl2\" (UID: \"fa738a2a-d979-4352-82d3-ed7eb89e8fd9\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.886818 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cws\" (UniqueName: \"kubernetes.io/projected/c692f12b-868a-4985-8c61-529463a4bbf5-kube-api-access-84cws\") pod \"cert-manager-5b446d88c5-tcddl\" (UID: \"c692f12b-868a-4985-8c61-529463a4bbf5\") " pod="cert-manager/cert-manager-5b446d88c5-tcddl" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.887167 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzf4h\" (UniqueName: \"kubernetes.io/projected/05f345c2-2e42-4cf0-85f6-6a40551d51d7-kube-api-access-jzf4h\") pod \"cert-manager-webhook-5655c58dd6-fc4dx\" (UID: \"05f345c2-2e42-4cf0-85f6-6a40551d51d7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.958019 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.962692 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-tcddl" Sep 30 03:04:28 crc kubenswrapper[4744]: I0930 03:04:28.991577 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:29 crc kubenswrapper[4744]: I0930 03:04:29.261194 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-fc4dx"] Sep 30 03:04:29 crc kubenswrapper[4744]: I0930 03:04:29.265172 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:04:29 crc kubenswrapper[4744]: I0930 03:04:29.416629 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tcddl"] Sep 30 03:04:29 crc kubenswrapper[4744]: W0930 03:04:29.426412 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc692f12b_868a_4985_8c61_529463a4bbf5.slice/crio-c2dd6a6e23eda7b6e5b1ba644de386662585befc3b1c59c975dde5d5f7af0a67 WatchSource:0}: Error finding container c2dd6a6e23eda7b6e5b1ba644de386662585befc3b1c59c975dde5d5f7af0a67: Status 404 returned error can't find the container with id c2dd6a6e23eda7b6e5b1ba644de386662585befc3b1c59c975dde5d5f7af0a67 Sep 30 03:04:29 crc kubenswrapper[4744]: I0930 03:04:29.434790 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-vhcl2"] Sep 30 03:04:29 crc kubenswrapper[4744]: W0930 03:04:29.441867 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa738a2a_d979_4352_82d3_ed7eb89e8fd9.slice/crio-46cdc359ff172102e8eb99d048654ce2fccd0594e96750fc242abfd7b98e9392 WatchSource:0}: Error finding container 46cdc359ff172102e8eb99d048654ce2fccd0594e96750fc242abfd7b98e9392: Status 404 returned error can't find the container with id 46cdc359ff172102e8eb99d048654ce2fccd0594e96750fc242abfd7b98e9392 Sep 30 03:04:30 crc kubenswrapper[4744]: I0930 03:04:30.151907 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" event={"ID":"fa738a2a-d979-4352-82d3-ed7eb89e8fd9","Type":"ContainerStarted","Data":"46cdc359ff172102e8eb99d048654ce2fccd0594e96750fc242abfd7b98e9392"} Sep 30 03:04:30 crc kubenswrapper[4744]: I0930 03:04:30.153356 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-tcddl" event={"ID":"c692f12b-868a-4985-8c61-529463a4bbf5","Type":"ContainerStarted","Data":"c2dd6a6e23eda7b6e5b1ba644de386662585befc3b1c59c975dde5d5f7af0a67"} Sep 30 03:04:30 crc kubenswrapper[4744]: I0930 03:04:30.154205 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" event={"ID":"05f345c2-2e42-4cf0-85f6-6a40551d51d7","Type":"ContainerStarted","Data":"b82e603ef81fabb05cb99a3a0b09aa31958d4a64446f53fabce1c50a2c2c326a"} Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.173679 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" event={"ID":"05f345c2-2e42-4cf0-85f6-6a40551d51d7","Type":"ContainerStarted","Data":"98145f508789dee46f32da93f4da5c5426803423506c05a49e19722083e97596"} Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.174301 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.176239 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" event={"ID":"fa738a2a-d979-4352-82d3-ed7eb89e8fd9","Type":"ContainerStarted","Data":"94db6d1f744b43280a169edc3509b133892fe277ac7123e846303e4a3a8ac170"} Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.178594 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-tcddl" 
event={"ID":"c692f12b-868a-4985-8c61-529463a4bbf5","Type":"ContainerStarted","Data":"654cc7296bd96fcdf0bf33d4bdd020320859d5f3105705ced65b744f83893ce9"} Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.202694 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" podStartSLOduration=2.306850438 podStartE2EDuration="5.202668339s" podCreationTimestamp="2025-09-30 03:04:28 +0000 UTC" firstStartedPulling="2025-09-30 03:04:29.264990946 +0000 UTC m=+596.438210920" lastFinishedPulling="2025-09-30 03:04:32.160808837 +0000 UTC m=+599.334028821" observedRunningTime="2025-09-30 03:04:33.196749314 +0000 UTC m=+600.369969288" watchObservedRunningTime="2025-09-30 03:04:33.202668339 +0000 UTC m=+600.375888343" Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.222910 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-tcddl" podStartSLOduration=2.309700376 podStartE2EDuration="5.222884417s" podCreationTimestamp="2025-09-30 03:04:28 +0000 UTC" firstStartedPulling="2025-09-30 03:04:29.430556435 +0000 UTC m=+596.603776419" lastFinishedPulling="2025-09-30 03:04:32.343740486 +0000 UTC m=+599.516960460" observedRunningTime="2025-09-30 03:04:33.221545146 +0000 UTC m=+600.394765150" watchObservedRunningTime="2025-09-30 03:04:33.222884417 +0000 UTC m=+600.396104411" Sep 30 03:04:33 crc kubenswrapper[4744]: I0930 03:04:33.250327 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-vhcl2" podStartSLOduration=2.285521406 podStartE2EDuration="5.250304071s" podCreationTimestamp="2025-09-30 03:04:28 +0000 UTC" firstStartedPulling="2025-09-30 03:04:29.444345764 +0000 UTC m=+596.617565758" lastFinishedPulling="2025-09-30 03:04:32.409128449 +0000 UTC m=+599.582348423" observedRunningTime="2025-09-30 03:04:33.249911258 +0000 UTC m=+600.423131272" 
watchObservedRunningTime="2025-09-30 03:04:33.250304071 +0000 UTC m=+600.423524045" Sep 30 03:04:34 crc kubenswrapper[4744]: I0930 03:04:34.348683 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:04:34 crc kubenswrapper[4744]: I0930 03:04:34.349178 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:04:34 crc kubenswrapper[4744]: I0930 03:04:34.349250 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:04:34 crc kubenswrapper[4744]: I0930 03:04:34.350264 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b74a2cf6f2555d1fa0644ad61981edd38de911946c34219f0d3be9f495b0022"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:04:34 crc kubenswrapper[4744]: I0930 03:04:34.350432 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://1b74a2cf6f2555d1fa0644ad61981edd38de911946c34219f0d3be9f495b0022" gracePeriod=600 Sep 30 03:04:35 crc kubenswrapper[4744]: I0930 03:04:35.199056 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="1b74a2cf6f2555d1fa0644ad61981edd38de911946c34219f0d3be9f495b0022" exitCode=0 Sep 30 03:04:35 crc kubenswrapper[4744]: I0930 03:04:35.199185 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"1b74a2cf6f2555d1fa0644ad61981edd38de911946c34219f0d3be9f495b0022"} Sep 30 03:04:35 crc kubenswrapper[4744]: I0930 03:04:35.199613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"30f20d65f55e83fb7df6fb2f203d982a107f210e9c52e670591915139c564a0e"} Sep 30 03:04:35 crc kubenswrapper[4744]: I0930 03:04:35.199646 4744 scope.go:117] "RemoveContainer" containerID="94351d00f554b003d6de416947e6ae7daf34d530ef3993fcc2dd9dc065b4279b" Sep 30 03:04:38 crc kubenswrapper[4744]: I0930 03:04:38.995143 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-fc4dx" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.208060 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c5kw2"] Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.208687 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-controller" containerID="cri-o://bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.209182 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="sbdb" 
containerID="cri-o://94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.209262 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="nbdb" containerID="cri-o://599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.209320 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="northd" containerID="cri-o://663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.209406 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.209466 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-node" containerID="cri-o://ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.209570 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-acl-logging" containerID="cri-o://ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 
03:04:39.329275 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" containerID="cri-o://f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" gracePeriod=30 Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.371665 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.373953 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.374225 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.376994 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.378249 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.378330 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="nbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.379412 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.379496 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="sbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.568890 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/3.log" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.571309 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovn-acl-logging/0.log" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.571973 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovn-controller/0.log" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.572481 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637677 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-ovn-kubernetes\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-bin\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637758 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637782 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-script-lib\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637779 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637800 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-etc-openvswitch\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637826 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637822 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637867 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-slash\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637897 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-env-overrides\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637917 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-kubelet\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637914 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637937 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-systemd\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638021 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-ovn\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638066 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-systemd-units\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.637998 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-slash" (OuterVolumeSpecName: "host-slash") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638065 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638061 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638112 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-netd\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638144 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638153 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-node-log\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638197 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-log-socket\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638236 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-netns\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638187 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-node-log" (OuterVolumeSpecName: "node-log") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-log-socket" (OuterVolumeSpecName: "log-socket") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638264 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-openvswitch\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638287 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-var-lib-openvswitch\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovn-node-metrics-cert\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638288 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638305 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638317 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638359 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638358 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-config\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmplx\" (UniqueName: \"kubernetes.io/projected/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-kube-api-access-cmplx\") pod \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\" (UID: \"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc\") " Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638481 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: 
"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638830 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638957 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.638981 4744 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639001 4744 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639019 4744 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639037 4744 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 
03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639055 4744 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639071 4744 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639086 4744 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639103 4744 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639119 4744 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639135 4744 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639152 4744 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639169 4744 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639185 4744 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639203 4744 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639219 4744 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.639235 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645057 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7gqc9"] Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645416 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645444 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645463 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc 
kubenswrapper[4744]: I0930 03:04:39.645478 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645498 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="sbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645538 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="sbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645563 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-node" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645576 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-node" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645591 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="nbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645603 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="nbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645615 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645628 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645644 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 
03:04:39.645656 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645681 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="northd" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645693 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="northd" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645709 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-acl-logging" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645720 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-acl-logging" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645737 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645750 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.645767 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kubecfg-setup" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645779 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kubecfg-setup" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645858 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovn-node-metrics-cert" (OuterVolumeSpecName: 
"ovn-node-metrics-cert") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645941 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-acl-logging" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645963 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="sbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645979 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.645996 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="nbdb" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646014 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646028 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovn-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646044 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646062 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="kube-rbac-proxy-node" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646079 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" 
containerName="kube-rbac-proxy-ovn-metrics" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646095 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="northd" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.646260 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646273 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: E0930 03:04:39.646295 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646308 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646482 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646502 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerName="ovnkube-controller" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.646675 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-kube-api-access-cmplx" (OuterVolumeSpecName: "kube-api-access-cmplx") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "kube-api-access-cmplx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.649184 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.665475 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" (UID: "0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740287 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-run-netns\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740356 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovn-node-metrics-cert\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740414 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-ovn\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740621 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-cni-netd\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-var-lib-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740744 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-systemd-units\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740832 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-systemd\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740878 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknqv\" (UniqueName: \"kubernetes.io/projected/ef49b4f0-49ef-46dc-bb93-55247fb68df7-kube-api-access-mknqv\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740922 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-run-ovn-kubernetes\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.740995 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-slash\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741031 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-log-socket\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741065 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-etc-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-cni-bin\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-node-log\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741226 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovnkube-script-lib\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741288 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-kubelet\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741411 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovnkube-config\") pod \"ovnkube-node-7gqc9\" (UID: 
\"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741499 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741546 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-env-overrides\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741642 4744 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741678 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.741755 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmplx\" (UniqueName: \"kubernetes.io/projected/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc-kube-api-access-cmplx\") on node \"crc\" DevicePath \"\"" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843267 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-systemd\") pod 
\"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843311 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknqv\" (UniqueName: \"kubernetes.io/projected/ef49b4f0-49ef-46dc-bb93-55247fb68df7-kube-api-access-mknqv\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-run-ovn-kubernetes\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843344 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-slash\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843414 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-log-socket\") pod 
\"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843430 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-etc-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-cni-bin\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843471 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-node-log\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843492 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovnkube-script-lib\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843515 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-kubelet\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843512 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-systemd\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843549 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-run-ovn-kubernetes\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843611 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-node-log\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-etc-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843669 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-cni-bin\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843696 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-slash\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843539 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovnkube-config\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843733 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843670 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-kubelet\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843755 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-log-socket\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843848 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-env-overrides\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843911 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-run-netns\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.843966 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844027 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovn-node-metrics-cert\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844055 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-run-netns\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844075 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-ovn\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844130 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-run-ovn\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-cni-netd\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-var-lib-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844355 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-systemd-units\") pod \"ovnkube-node-7gqc9\" 
(UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-env-overrides\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844527 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-var-lib-openvswitch\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844620 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-systemd-units\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844671 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ef49b4f0-49ef-46dc-bb93-55247fb68df7-host-cni-netd\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.844892 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovnkube-script-lib\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 
03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.846680 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovnkube-config\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.850066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef49b4f0-49ef-46dc-bb93-55247fb68df7-ovn-node-metrics-cert\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.870115 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknqv\" (UniqueName: \"kubernetes.io/projected/ef49b4f0-49ef-46dc-bb93-55247fb68df7-kube-api-access-mknqv\") pod \"ovnkube-node-7gqc9\" (UID: \"ef49b4f0-49ef-46dc-bb93-55247fb68df7\") " pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:39 crc kubenswrapper[4744]: I0930 03:04:39.973852 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.315441 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovnkube-controller/3.log" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.319155 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovn-acl-logging/0.log" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.319930 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c5kw2_0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/ovn-controller/0.log" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320532 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320570 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320580 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320594 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320604 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" 
containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320613 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320622 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" exitCode=143 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320631 4744 generic.go:334] "Generic (PLEG): container finished" podID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" exitCode=143 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320631 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320713 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320742 4744 scope.go:117] "RemoveContainer" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} Sep 30 
03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320769 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320790 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320614 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320811 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320832 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320854 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320867 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320879 4744 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320891 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320903 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320915 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320927 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320938 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320955 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320971 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 
03:04:40.320984 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.320998 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321009 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321021 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321033 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321044 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321056 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321067 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 
03:04:40.321078 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321111 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321127 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321140 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321152 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321164 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321176 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321188 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321199 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321211 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321222 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321238 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c5kw2" event={"ID":"0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc","Type":"ContainerDied","Data":"3ba22fc8f802c825101b75a2921185f90beaf2cc982907ead9558a1118b4e1e2"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321255 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321268 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321281 4744 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321293 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321306 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321320 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321332 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321344 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321356 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.321405 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.326587 4744 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/2.log" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.327312 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/1.log" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.327361 4744 generic.go:334] "Generic (PLEG): container finished" podID="6561e3c6-a8d1-4dc8-8bd3-09f042393658" containerID="87fcda1a58aa577149c9ec3a622519ff80ed9a5e10e797f9632a1ac862b78ced" exitCode=2 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.327416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerDied","Data":"87fcda1a58aa577149c9ec3a622519ff80ed9a5e10e797f9632a1ac862b78ced"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.327458 4744 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.328085 4744 scope.go:117] "RemoveContainer" containerID="87fcda1a58aa577149c9ec3a622519ff80ed9a5e10e797f9632a1ac862b78ced" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.328287 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nxppc_openshift-multus(6561e3c6-a8d1-4dc8-8bd3-09f042393658)\"" pod="openshift-multus/multus-nxppc" podUID="6561e3c6-a8d1-4dc8-8bd3-09f042393658" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.333388 4744 generic.go:334] "Generic (PLEG): container finished" podID="ef49b4f0-49ef-46dc-bb93-55247fb68df7" containerID="6717448f5992f1805502ec338a2f5ab9e929b2b56ed0e4e23d94192f527d878c" 
exitCode=0 Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.333429 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerDied","Data":"6717448f5992f1805502ec338a2f5ab9e929b2b56ed0e4e23d94192f527d878c"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.333454 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"e286f716cd42bd78275f51506c30e7d0acf3279ffffcb6fd8e6b43b36313f8a0"} Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.394324 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.414624 4744 scope.go:117] "RemoveContainer" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.445901 4744 scope.go:117] "RemoveContainer" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.451667 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c5kw2"] Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.458424 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c5kw2"] Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.472341 4744 scope.go:117] "RemoveContainer" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.488121 4744 scope.go:117] "RemoveContainer" containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.500811 4744 scope.go:117] "RemoveContainer" 
containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.522652 4744 scope.go:117] "RemoveContainer" containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.578320 4744 scope.go:117] "RemoveContainer" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.596643 4744 scope.go:117] "RemoveContainer" containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.633842 4744 scope.go:117] "RemoveContainer" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.634722 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": container with ID starting with f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13 not found: ID does not exist" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.634752 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} err="failed to get container status \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": rpc error: code = NotFound desc = could not find container \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": container with ID starting with f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.634774 4744 scope.go:117] "RemoveContainer" 
containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.635174 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": container with ID starting with 8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af not found: ID does not exist" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.635201 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} err="failed to get container status \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": rpc error: code = NotFound desc = could not find container \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": container with ID starting with 8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.635220 4744 scope.go:117] "RemoveContainer" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.635517 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": container with ID starting with 94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268 not found: ID does not exist" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.635566 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} err="failed to get container status \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": rpc error: code = NotFound desc = could not find container \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": container with ID starting with 94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.635596 4744 scope.go:117] "RemoveContainer" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.636075 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": container with ID starting with 599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf not found: ID does not exist" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.636096 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} err="failed to get container status \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": rpc error: code = NotFound desc = could not find container \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": container with ID starting with 599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.636109 4744 scope.go:117] "RemoveContainer" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.636394 4744 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": container with ID starting with 663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c not found: ID does not exist" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.636425 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} err="failed to get container status \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": rpc error: code = NotFound desc = could not find container \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": container with ID starting with 663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.636443 4744 scope.go:117] "RemoveContainer" containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.636717 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": container with ID starting with ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543 not found: ID does not exist" containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.636738 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} err="failed to get container status \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": rpc error: code = NotFound desc = could not find container 
\"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": container with ID starting with ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.636755 4744 scope.go:117] "RemoveContainer" containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.637189 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": container with ID starting with ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd not found: ID does not exist" containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.637233 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} err="failed to get container status \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": rpc error: code = NotFound desc = could not find container \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": container with ID starting with ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.637265 4744 scope.go:117] "RemoveContainer" containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.637572 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": container with ID starting with ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c not found: ID does not exist" 
containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.637593 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} err="failed to get container status \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": rpc error: code = NotFound desc = could not find container \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": container with ID starting with ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.637605 4744 scope.go:117] "RemoveContainer" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.638038 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": container with ID starting with bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725 not found: ID does not exist" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.638075 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} err="failed to get container status \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": rpc error: code = NotFound desc = could not find container \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": container with ID starting with bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.638094 4744 scope.go:117] 
"RemoveContainer" containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" Sep 30 03:04:40 crc kubenswrapper[4744]: E0930 03:04:40.638400 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": container with ID starting with 27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee not found: ID does not exist" containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.638423 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} err="failed to get container status \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": rpc error: code = NotFound desc = could not find container \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": container with ID starting with 27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.638439 4744 scope.go:117] "RemoveContainer" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.638664 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} err="failed to get container status \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": rpc error: code = NotFound desc = could not find container \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": container with ID starting with f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.638689 4744 
scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.639236 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} err="failed to get container status \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": rpc error: code = NotFound desc = could not find container \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": container with ID starting with 8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.639267 4744 scope.go:117] "RemoveContainer" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.639579 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} err="failed to get container status \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": rpc error: code = NotFound desc = could not find container \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": container with ID starting with 94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.639609 4744 scope.go:117] "RemoveContainer" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.639872 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} err="failed to get container status \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": rpc 
error: code = NotFound desc = could not find container \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": container with ID starting with 599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.639898 4744 scope.go:117] "RemoveContainer" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.640095 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} err="failed to get container status \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": rpc error: code = NotFound desc = could not find container \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": container with ID starting with 663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.640121 4744 scope.go:117] "RemoveContainer" containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.640423 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} err="failed to get container status \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": rpc error: code = NotFound desc = could not find container \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": container with ID starting with ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.640443 4744 scope.go:117] "RemoveContainer" containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" Sep 30 03:04:40 crc 
kubenswrapper[4744]: I0930 03:04:40.640628 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} err="failed to get container status \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": rpc error: code = NotFound desc = could not find container \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": container with ID starting with ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.640657 4744 scope.go:117] "RemoveContainer" containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.641129 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} err="failed to get container status \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": rpc error: code = NotFound desc = could not find container \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": container with ID starting with ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.641149 4744 scope.go:117] "RemoveContainer" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.641420 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} err="failed to get container status \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": rpc error: code = NotFound desc = could not find container \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": container 
with ID starting with bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.641441 4744 scope.go:117] "RemoveContainer" containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.641754 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} err="failed to get container status \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": rpc error: code = NotFound desc = could not find container \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": container with ID starting with 27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.641799 4744 scope.go:117] "RemoveContainer" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.642210 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} err="failed to get container status \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": rpc error: code = NotFound desc = could not find container \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": container with ID starting with f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.642276 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.642738 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} err="failed to get container status \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": rpc error: code = NotFound desc = could not find container \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": container with ID starting with 8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.642761 4744 scope.go:117] "RemoveContainer" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.643043 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} err="failed to get container status \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": rpc error: code = NotFound desc = could not find container \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": container with ID starting with 94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.643090 4744 scope.go:117] "RemoveContainer" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.643399 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} err="failed to get container status \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": rpc error: code = NotFound desc = could not find container \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": container with ID starting with 599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf not found: ID does not 
exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.643425 4744 scope.go:117] "RemoveContainer" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.643651 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} err="failed to get container status \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": rpc error: code = NotFound desc = could not find container \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": container with ID starting with 663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.643682 4744 scope.go:117] "RemoveContainer" containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.644053 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} err="failed to get container status \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": rpc error: code = NotFound desc = could not find container \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": container with ID starting with ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.644075 4744 scope.go:117] "RemoveContainer" containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.644398 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} err="failed to get container status 
\"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": rpc error: code = NotFound desc = could not find container \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": container with ID starting with ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.644426 4744 scope.go:117] "RemoveContainer" containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.644665 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} err="failed to get container status \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": rpc error: code = NotFound desc = could not find container \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": container with ID starting with ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.644695 4744 scope.go:117] "RemoveContainer" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.645026 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} err="failed to get container status \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": rpc error: code = NotFound desc = could not find container \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": container with ID starting with bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.645057 4744 scope.go:117] "RemoveContainer" 
containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.645429 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} err="failed to get container status \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": rpc error: code = NotFound desc = could not find container \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": container with ID starting with 27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.645463 4744 scope.go:117] "RemoveContainer" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.646046 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} err="failed to get container status \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": rpc error: code = NotFound desc = could not find container \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": container with ID starting with f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.646105 4744 scope.go:117] "RemoveContainer" containerID="8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.646576 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af"} err="failed to get container status \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": rpc error: code = NotFound desc = could 
not find container \"8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af\": container with ID starting with 8beb1e9e02c3c6da6a72f75f3d9855b39e448384e55ac5e22d908b80fc2425af not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.646602 4744 scope.go:117] "RemoveContainer" containerID="94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.646936 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268"} err="failed to get container status \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": rpc error: code = NotFound desc = could not find container \"94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268\": container with ID starting with 94c0fe7c60647a4e39998b486e19f1bba66ba55c4a8efda1bfcc13161a2b3268 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.646958 4744 scope.go:117] "RemoveContainer" containerID="599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.647275 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf"} err="failed to get container status \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": rpc error: code = NotFound desc = could not find container \"599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf\": container with ID starting with 599b266a237063daba6bcd00d24c6293dc62c01c13dd03da3448be7361d30fbf not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.647303 4744 scope.go:117] "RemoveContainer" containerID="663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 
03:04:40.647583 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c"} err="failed to get container status \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": rpc error: code = NotFound desc = could not find container \"663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c\": container with ID starting with 663218a7c63e68dd68f0dac14f67427dd9340383320683abe012ff14712d4f5c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.647606 4744 scope.go:117] "RemoveContainer" containerID="ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.648249 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543"} err="failed to get container status \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": rpc error: code = NotFound desc = could not find container \"ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543\": container with ID starting with ec611e66fafb0eaa830ba641966695911e40c116c1ab2f2bb69ed038df33b543 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.648277 4744 scope.go:117] "RemoveContainer" containerID="ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.648720 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd"} err="failed to get container status \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": rpc error: code = NotFound desc = could not find container \"ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd\": container with ID starting with 
ceaf13a891c40231b89a42ee37bdb7ae176535a469ee786126543f3631d637cd not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.648740 4744 scope.go:117] "RemoveContainer" containerID="ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.649008 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c"} err="failed to get container status \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": rpc error: code = NotFound desc = could not find container \"ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c\": container with ID starting with ff47a4acbf67c1d802cf55e3cd73c8bfc7735caf055e86a763c398ba647b279c not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.649046 4744 scope.go:117] "RemoveContainer" containerID="bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.649446 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725"} err="failed to get container status \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": rpc error: code = NotFound desc = could not find container \"bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725\": container with ID starting with bbb2e2c755bf341761839153379466f2c7d0a7edd6f3ffa0055ed577d75bd725 not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.649466 4744 scope.go:117] "RemoveContainer" containerID="27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.649763 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee"} err="failed to get container status \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": rpc error: code = NotFound desc = could not find container \"27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee\": container with ID starting with 27594c0002db2db4d9657f7208e0d51706b40edfe685644120a3828455336fee not found: ID does not exist" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.649805 4744 scope.go:117] "RemoveContainer" containerID="f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13" Sep 30 03:04:40 crc kubenswrapper[4744]: I0930 03:04:40.650169 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13"} err="failed to get container status \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": rpc error: code = NotFound desc = could not find container \"f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13\": container with ID starting with f17f8725d0c0dccb7081a747c1f0cf9a42bd61340eccb1e5eb14e36b5ef1ec13 not found: ID does not exist" Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.346465 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"ea2d86c87b6ce9fd062be95d222d5ae39bc9216d0ae718270fcf443a7c3ba6e8"} Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.346511 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"90d4cace62adcd2771ba82fbbca9dc247ef4f4796ad8895e982b4e762aafec08"} Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.346523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"faff448ba946f6bdfc5622d4709ccfa7f3b82846bd851ed57bca36d4c575f53d"} Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.346535 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"0b0a67fa8d44b0878f193378842153895910ba94f8b34e8a0e26de51322195df"} Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.346545 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"66e0609a2256f5f79a2c9dfb4453d669a8b08bc8b61a5bc2c3b240c69ad9e1f6"} Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.346555 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"7d933850681e6cd478ab8e243937a6c33dd9af32b1e02d92cbfb13586e8b95bb"} Sep 30 03:04:41 crc kubenswrapper[4744]: I0930 03:04:41.515947 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc" path="/var/lib/kubelet/pods/0aeabd38-6a1f-4ca5-b992-0dfe8ac0ecfc/volumes" Sep 30 03:04:44 crc kubenswrapper[4744]: I0930 03:04:44.376513 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"99128aa55088e4fb48b11cec5eac713139f454cdadbda8c1de43957ddd6acdac"} Sep 30 03:04:46 crc kubenswrapper[4744]: I0930 03:04:46.393592 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" 
event={"ID":"ef49b4f0-49ef-46dc-bb93-55247fb68df7","Type":"ContainerStarted","Data":"3d33e880f27adc1df8b8a9415eb09bea1f597b35b7cff3d035559afde19c9218"} Sep 30 03:04:46 crc kubenswrapper[4744]: I0930 03:04:46.394846 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:46 crc kubenswrapper[4744]: I0930 03:04:46.429618 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:46 crc kubenswrapper[4744]: I0930 03:04:46.435854 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" podStartSLOduration=7.435832335 podStartE2EDuration="7.435832335s" podCreationTimestamp="2025-09-30 03:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:04:46.432216233 +0000 UTC m=+613.605436257" watchObservedRunningTime="2025-09-30 03:04:46.435832335 +0000 UTC m=+613.609052349" Sep 30 03:04:47 crc kubenswrapper[4744]: I0930 03:04:47.402515 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:47 crc kubenswrapper[4744]: I0930 03:04:47.402589 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:47 crc kubenswrapper[4744]: I0930 03:04:47.448160 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:04:51 crc kubenswrapper[4744]: I0930 03:04:51.504311 4744 scope.go:117] "RemoveContainer" containerID="87fcda1a58aa577149c9ec3a622519ff80ed9a5e10e797f9632a1ac862b78ced" Sep 30 03:04:51 crc kubenswrapper[4744]: E0930 03:04:51.505527 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nxppc_openshift-multus(6561e3c6-a8d1-4dc8-8bd3-09f042393658)\"" pod="openshift-multus/multus-nxppc" podUID="6561e3c6-a8d1-4dc8-8bd3-09f042393658" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.099914 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.102143 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.105079 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.105488 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.107262 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mlnbz" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.197239 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-run\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.197287 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-data\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.197578 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h287r\" (UniqueName: 
\"kubernetes.io/projected/800e9149-6d7e-4196-bad2-e747131c3e34-kube-api-access-h287r\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.197662 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-log\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.299271 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-run\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.299318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-data\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.299404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h287r\" (UniqueName: \"kubernetes.io/projected/800e9149-6d7e-4196-bad2-e747131c3e34-kube-api-access-h287r\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.299444 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-log\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.300026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: 
\"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-log\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.300098 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-run\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.300265 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/800e9149-6d7e-4196-bad2-e747131c3e34-data\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.324799 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h287r\" (UniqueName: \"kubernetes.io/projected/800e9149-6d7e-4196-bad2-e747131c3e34-kube-api-access-h287r\") pod \"ceph\" (UID: \"800e9149-6d7e-4196-bad2-e747131c3e34\") " pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.428307 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Sep 30 03:05:03 crc kubenswrapper[4744]: W0930 03:05:03.466742 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod800e9149_6d7e_4196_bad2_e747131c3e34.slice/crio-6f8c8dadcf49cc3dd12bd1e47b01ea352bd2489c88c4e35dd01f7f40392c3dfc WatchSource:0}: Error finding container 6f8c8dadcf49cc3dd12bd1e47b01ea352bd2489c88c4e35dd01f7f40392c3dfc: Status 404 returned error can't find the container with id 6f8c8dadcf49cc3dd12bd1e47b01ea352bd2489c88c4e35dd01f7f40392c3dfc Sep 30 03:05:03 crc kubenswrapper[4744]: I0930 03:05:03.522246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"800e9149-6d7e-4196-bad2-e747131c3e34","Type":"ContainerStarted","Data":"6f8c8dadcf49cc3dd12bd1e47b01ea352bd2489c88c4e35dd01f7f40392c3dfc"} Sep 30 03:05:06 crc kubenswrapper[4744]: I0930 03:05:06.503156 4744 scope.go:117] "RemoveContainer" containerID="87fcda1a58aa577149c9ec3a622519ff80ed9a5e10e797f9632a1ac862b78ced" Sep 30 03:05:07 crc kubenswrapper[4744]: I0930 03:05:07.543034 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/2.log" Sep 30 03:05:07 crc kubenswrapper[4744]: I0930 03:05:07.543737 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/1.log" Sep 30 03:05:07 crc kubenswrapper[4744]: I0930 03:05:07.543783 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nxppc" event={"ID":"6561e3c6-a8d1-4dc8-8bd3-09f042393658","Type":"ContainerStarted","Data":"c9ff9385c8ef59b116700d811614b968080fc2848f2201bced559e13522f5ef0"} Sep 30 03:05:10 crc kubenswrapper[4744]: I0930 03:05:10.039125 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7gqc9" Sep 30 03:05:19 crc 
kubenswrapper[4744]: E0930 03:05:19.312321 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Sep 30 03:05:19 crc kubenswrapper[4744]: E0930 03:05:19.313093 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h287r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(800e9149-6d7e-4196-bad2-e747131c3e34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Sep 30 03:05:19 crc kubenswrapper[4744]: E0930 03:05:19.314719 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceph" podUID="800e9149-6d7e-4196-bad2-e747131c3e34" Sep 30 03:05:19 crc kubenswrapper[4744]: E0930 03:05:19.615705 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="800e9149-6d7e-4196-bad2-e747131c3e34" Sep 30 03:05:32 crc kubenswrapper[4744]: I0930 03:05:32.705828 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"800e9149-6d7e-4196-bad2-e747131c3e34","Type":"ContainerStarted","Data":"20c57489d87e4763143747c7c09e8ada6c3a4263a01733dc344bee608e8c2095"} Sep 30 03:05:32 crc kubenswrapper[4744]: I0930 03:05:32.726649 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=1.123962475 podStartE2EDuration="29.726618683s" podCreationTimestamp="2025-09-30 03:05:03 +0000 UTC" firstStartedPulling="2025-09-30 03:05:03.470990027 +0000 UTC m=+630.644210031" lastFinishedPulling="2025-09-30 03:05:32.073646235 +0000 UTC m=+659.246866239" observedRunningTime="2025-09-30 03:05:32.726320874 +0000 UTC m=+659.899540858" watchObservedRunningTime="2025-09-30 03:05:32.726618683 +0000 UTC m=+659.899838697" Sep 30 03:05:33 crc kubenswrapper[4744]: I0930 03:05:33.684452 4744 scope.go:117] "RemoveContainer" containerID="cb8f1f4434989f4ca65aafc1e21e88a02079bdf1f5c2789aa582f15c6a05bfea" Sep 30 03:05:34 crc kubenswrapper[4744]: I0930 03:05:34.724203 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nxppc_6561e3c6-a8d1-4dc8-8bd3-09f042393658/kube-multus/2.log" Sep 30 03:05:35 crc kubenswrapper[4744]: E0930 
03:05:35.531334 4744 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:48226->38.102.83.51:40449: write tcp 38.102.83.51:48226->38.102.83.51:40449: write: connection reset by peer Sep 30 03:06:26 crc kubenswrapper[4744]: E0930 03:06:26.353061 4744 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:39222->38.102.83.51:40449: write tcp 38.102.83.51:39222->38.102.83.51:40449: write: broken pipe Sep 30 03:06:34 crc kubenswrapper[4744]: I0930 03:06:34.348295 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:06:34 crc kubenswrapper[4744]: I0930 03:06:34.348928 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:06:51 crc kubenswrapper[4744]: I0930 03:06:51.962421 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl"] Sep 30 03:06:51 crc kubenswrapper[4744]: I0930 03:06:51.964024 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:51 crc kubenswrapper[4744]: I0930 03:06:51.968141 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 03:06:51 crc kubenswrapper[4744]: I0930 03:06:51.974952 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl"] Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.060484 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj485\" (UniqueName: \"kubernetes.io/projected/9146dff8-2558-4464-8f88-5700a10d2ab3-kube-api-access-wj485\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.060612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.060718 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: 
I0930 03:06:52.161650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.161756 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.161838 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj485\" (UniqueName: \"kubernetes.io/projected/9146dff8-2558-4464-8f88-5700a10d2ab3-kube-api-access-wj485\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.162548 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.162707 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.188877 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj485\" (UniqueName: \"kubernetes.io/projected/9146dff8-2558-4464-8f88-5700a10d2ab3-kube-api-access-wj485\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.293004 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:52 crc kubenswrapper[4744]: I0930 03:06:52.616026 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl"] Sep 30 03:06:53 crc kubenswrapper[4744]: I0930 03:06:53.313306 4744 generic.go:334] "Generic (PLEG): container finished" podID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerID="650878b72312cb880dbe794ea94dee20a626e30ebb01ada28248390af28b093e" exitCode=0 Sep 30 03:06:53 crc kubenswrapper[4744]: I0930 03:06:53.313750 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" event={"ID":"9146dff8-2558-4464-8f88-5700a10d2ab3","Type":"ContainerDied","Data":"650878b72312cb880dbe794ea94dee20a626e30ebb01ada28248390af28b093e"} Sep 30 03:06:53 crc kubenswrapper[4744]: I0930 03:06:53.313786 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" event={"ID":"9146dff8-2558-4464-8f88-5700a10d2ab3","Type":"ContainerStarted","Data":"bde883bfa42cd5d676f610a628381f38089065ca9a79fd31f828088992bfbc6a"} Sep 30 03:06:55 crc kubenswrapper[4744]: I0930 03:06:55.326234 4744 generic.go:334] "Generic (PLEG): container finished" podID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerID="64614087b6c52c52e12a8c5568d3e20cbc71f0257d0a373180f72a8e82f1a165" exitCode=0 Sep 30 03:06:55 crc kubenswrapper[4744]: I0930 03:06:55.326415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" event={"ID":"9146dff8-2558-4464-8f88-5700a10d2ab3","Type":"ContainerDied","Data":"64614087b6c52c52e12a8c5568d3e20cbc71f0257d0a373180f72a8e82f1a165"} Sep 30 03:06:56 crc kubenswrapper[4744]: I0930 03:06:56.339488 4744 generic.go:334] "Generic (PLEG): container finished" podID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerID="d67d5993a8be399d870af7b49265f673c0655a1d407878377bdfaa9a18fcd1fa" exitCode=0 Sep 30 03:06:56 crc kubenswrapper[4744]: I0930 03:06:56.339552 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" event={"ID":"9146dff8-2558-4464-8f88-5700a10d2ab3","Type":"ContainerDied","Data":"d67d5993a8be399d870af7b49265f673c0655a1d407878377bdfaa9a18fcd1fa"} Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.653483 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.748918 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-util\") pod \"9146dff8-2558-4464-8f88-5700a10d2ab3\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.749070 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-bundle\") pod \"9146dff8-2558-4464-8f88-5700a10d2ab3\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.749095 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj485\" (UniqueName: \"kubernetes.io/projected/9146dff8-2558-4464-8f88-5700a10d2ab3-kube-api-access-wj485\") pod \"9146dff8-2558-4464-8f88-5700a10d2ab3\" (UID: \"9146dff8-2558-4464-8f88-5700a10d2ab3\") " Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.749933 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-bundle" (OuterVolumeSpecName: "bundle") pod "9146dff8-2558-4464-8f88-5700a10d2ab3" (UID: "9146dff8-2558-4464-8f88-5700a10d2ab3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.754684 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9146dff8-2558-4464-8f88-5700a10d2ab3-kube-api-access-wj485" (OuterVolumeSpecName: "kube-api-access-wj485") pod "9146dff8-2558-4464-8f88-5700a10d2ab3" (UID: "9146dff8-2558-4464-8f88-5700a10d2ab3"). InnerVolumeSpecName "kube-api-access-wj485". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.765047 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-util" (OuterVolumeSpecName: "util") pod "9146dff8-2558-4464-8f88-5700a10d2ab3" (UID: "9146dff8-2558-4464-8f88-5700a10d2ab3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.851351 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.851649 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj485\" (UniqueName: \"kubernetes.io/projected/9146dff8-2558-4464-8f88-5700a10d2ab3-kube-api-access-wj485\") on node \"crc\" DevicePath \"\"" Sep 30 03:06:57 crc kubenswrapper[4744]: I0930 03:06:57.851811 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9146dff8-2558-4464-8f88-5700a10d2ab3-util\") on node \"crc\" DevicePath \"\"" Sep 30 03:06:58 crc kubenswrapper[4744]: I0930 03:06:58.358153 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" event={"ID":"9146dff8-2558-4464-8f88-5700a10d2ab3","Type":"ContainerDied","Data":"bde883bfa42cd5d676f610a628381f38089065ca9a79fd31f828088992bfbc6a"} Sep 30 03:06:58 crc kubenswrapper[4744]: I0930 03:06:58.358210 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde883bfa42cd5d676f610a628381f38089065ca9a79fd31f828088992bfbc6a" Sep 30 03:06:58 crc kubenswrapper[4744]: I0930 03:06:58.358273 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.552983 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f"] Sep 30 03:06:59 crc kubenswrapper[4744]: E0930 03:06:59.553214 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="extract" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.553228 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="extract" Sep 30 03:06:59 crc kubenswrapper[4744]: E0930 03:06:59.553238 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="pull" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.553245 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="pull" Sep 30 03:06:59 crc kubenswrapper[4744]: E0930 03:06:59.553262 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="util" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.553270 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="util" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.553417 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9146dff8-2558-4464-8f88-5700a10d2ab3" containerName="extract" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.553830 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.555835 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-8fn89" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.555860 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.555837 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.572227 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f"] Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.572252 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2f6w\" (UniqueName: \"kubernetes.io/projected/1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c-kube-api-access-t2f6w\") pod \"nmstate-operator-5d6f6cfd66-8lx9f\" (UID: \"1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.674477 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2f6w\" (UniqueName: \"kubernetes.io/projected/1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c-kube-api-access-t2f6w\") pod \"nmstate-operator-5d6f6cfd66-8lx9f\" (UID: \"1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.694235 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2f6w\" (UniqueName: \"kubernetes.io/projected/1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c-kube-api-access-t2f6w\") pod \"nmstate-operator-5d6f6cfd66-8lx9f\" (UID: 
\"1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" Sep 30 03:06:59 crc kubenswrapper[4744]: I0930 03:06:59.869715 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" Sep 30 03:07:00 crc kubenswrapper[4744]: I0930 03:07:00.129697 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f"] Sep 30 03:07:00 crc kubenswrapper[4744]: W0930 03:07:00.138078 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b8b0be2_d1e6_44b8_b5ba_af3ee8cfed0c.slice/crio-c62c77f6c53f40e007a25c7be0a5b4a1fb2bb3748a92dd7295ec8af0886743b9 WatchSource:0}: Error finding container c62c77f6c53f40e007a25c7be0a5b4a1fb2bb3748a92dd7295ec8af0886743b9: Status 404 returned error can't find the container with id c62c77f6c53f40e007a25c7be0a5b4a1fb2bb3748a92dd7295ec8af0886743b9 Sep 30 03:07:00 crc kubenswrapper[4744]: I0930 03:07:00.369072 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" event={"ID":"1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c","Type":"ContainerStarted","Data":"c62c77f6c53f40e007a25c7be0a5b4a1fb2bb3748a92dd7295ec8af0886743b9"} Sep 30 03:07:02 crc kubenswrapper[4744]: I0930 03:07:02.383833 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" event={"ID":"1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c","Type":"ContainerStarted","Data":"054b64a53b07da9f66215c0aee3aa0ffc4e8766aa7dc92b8ec86312df686fe8d"} Sep 30 03:07:02 crc kubenswrapper[4744]: I0930 03:07:02.406269 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-8lx9f" podStartSLOduration=1.46216412 podStartE2EDuration="3.406241759s" podCreationTimestamp="2025-09-30 03:06:59 +0000 UTC" 
firstStartedPulling="2025-09-30 03:07:00.140438313 +0000 UTC m=+747.313658287" lastFinishedPulling="2025-09-30 03:07:02.084515952 +0000 UTC m=+749.257735926" observedRunningTime="2025-09-30 03:07:02.404052581 +0000 UTC m=+749.577272615" watchObservedRunningTime="2025-09-30 03:07:02.406241759 +0000 UTC m=+749.579461763" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.474830 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.478393 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.481868 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-lclrw"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.482825 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bsncc" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.482906 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.484509 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.487160 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.521266 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-lclrw"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.535405 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-57mt4"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.536272 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.536865 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/37a9886e-c2c0-46ab-a260-57231999e956-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-lclrw\" (UID: \"37a9886e-c2c0-46ab-a260-57231999e956\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.536910 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whfb\" (UniqueName: \"kubernetes.io/projected/37a9886e-c2c0-46ab-a260-57231999e956-kube-api-access-9whfb\") pod \"nmstate-webhook-6d689559c5-lclrw\" (UID: \"37a9886e-c2c0-46ab-a260-57231999e956\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.536976 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dw8m6\" (UniqueName: \"kubernetes.io/projected/ca163139-502b-44cc-ae53-83bc49866259-kube-api-access-dw8m6\") pod \"nmstate-metrics-58fcddf996-tt4lr\" (UID: \"ca163139-502b-44cc-ae53-83bc49866259\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.604529 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.605701 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.609978 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.610249 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.610455 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tf4h8" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.620099 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.637625 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-ovs-socket\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.637706 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c58a40af-7fd8-4a82-8109-855fbb1c32f3-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.637874 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnxr\" (UniqueName: \"kubernetes.io/projected/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-kube-api-access-9rnxr\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.637932 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/37a9886e-c2c0-46ab-a260-57231999e956-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-lclrw\" (UID: \"37a9886e-c2c0-46ab-a260-57231999e956\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.637990 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whfb\" (UniqueName: \"kubernetes.io/projected/37a9886e-c2c0-46ab-a260-57231999e956-kube-api-access-9whfb\") pod \"nmstate-webhook-6d689559c5-lclrw\" (UID: \"37a9886e-c2c0-46ab-a260-57231999e956\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.638046 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c58a40af-7fd8-4a82-8109-855fbb1c32f3-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.638120 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-dbus-socket\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.638210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8m6\" (UniqueName: \"kubernetes.io/projected/ca163139-502b-44cc-ae53-83bc49866259-kube-api-access-dw8m6\") pod \"nmstate-metrics-58fcddf996-tt4lr\" (UID: \"ca163139-502b-44cc-ae53-83bc49866259\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.638248 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-nmstate-lock\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.638275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcln\" (UniqueName: \"kubernetes.io/projected/c58a40af-7fd8-4a82-8109-855fbb1c32f3-kube-api-access-7hcln\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.646003 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/37a9886e-c2c0-46ab-a260-57231999e956-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-lclrw\" (UID: \"37a9886e-c2c0-46ab-a260-57231999e956\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 
03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.653935 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whfb\" (UniqueName: \"kubernetes.io/projected/37a9886e-c2c0-46ab-a260-57231999e956-kube-api-access-9whfb\") pod \"nmstate-webhook-6d689559c5-lclrw\" (UID: \"37a9886e-c2c0-46ab-a260-57231999e956\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.677310 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8m6\" (UniqueName: \"kubernetes.io/projected/ca163139-502b-44cc-ae53-83bc49866259-kube-api-access-dw8m6\") pod \"nmstate-metrics-58fcddf996-tt4lr\" (UID: \"ca163139-502b-44cc-ae53-83bc49866259\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740079 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c58a40af-7fd8-4a82-8109-855fbb1c32f3-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740166 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-dbus-socket\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740227 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-nmstate-lock\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 
crc kubenswrapper[4744]: I0930 03:07:03.740255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hcln\" (UniqueName: \"kubernetes.io/projected/c58a40af-7fd8-4a82-8109-855fbb1c32f3-kube-api-access-7hcln\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740300 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-ovs-socket\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740330 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c58a40af-7fd8-4a82-8109-855fbb1c32f3-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740378 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rnxr\" (UniqueName: \"kubernetes.io/projected/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-kube-api-access-9rnxr\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740456 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-dbus-socket\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 
crc kubenswrapper[4744]: I0930 03:07:03.740669 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-nmstate-lock\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.740700 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-ovs-socket\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.741238 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c58a40af-7fd8-4a82-8109-855fbb1c32f3-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.743557 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c58a40af-7fd8-4a82-8109-855fbb1c32f3-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.761186 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hcln\" (UniqueName: \"kubernetes.io/projected/c58a40af-7fd8-4a82-8109-855fbb1c32f3-kube-api-access-7hcln\") pod \"nmstate-console-plugin-864bb6dfb5-kg924\" (UID: \"c58a40af-7fd8-4a82-8109-855fbb1c32f3\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc 
kubenswrapper[4744]: I0930 03:07:03.766704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnxr\" (UniqueName: \"kubernetes.io/projected/1b5d08ff-f7f3-4f08-a8be-dd45390037e4-kube-api-access-9rnxr\") pod \"nmstate-handler-57mt4\" (UID: \"1b5d08ff-f7f3-4f08-a8be-dd45390037e4\") " pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.803223 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.806619 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-649d7b75dc-2h4tl"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.807302 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.815757 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.829116 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-649d7b75dc-2h4tl"] Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.841658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-trusted-ca-bundle\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.841695 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-service-ca\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.841717 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66vpm\" (UniqueName: \"kubernetes.io/projected/c71c1f25-489e-4e86-a414-8376c9aaf25f-kube-api-access-66vpm\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.841746 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-oauth-config\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.841885 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-config\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.841956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-oauth-serving-cert\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.842008 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-serving-cert\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.862158 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.925746 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943196 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-serving-cert\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943251 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-trusted-ca-bundle\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-service-ca\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943297 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66vpm\" (UniqueName: \"kubernetes.io/projected/c71c1f25-489e-4e86-a414-8376c9aaf25f-kube-api-access-66vpm\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-oauth-config\") pod \"console-649d7b75dc-2h4tl\" (UID: 
\"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-config\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.943397 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-oauth-serving-cert\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.944163 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-oauth-serving-cert\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.944170 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-service-ca\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.945349 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-config\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " 
pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.946003 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c71c1f25-489e-4e86-a414-8376c9aaf25f-trusted-ca-bundle\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.949006 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-serving-cert\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.949439 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c71c1f25-489e-4e86-a414-8376c9aaf25f-console-oauth-config\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:03 crc kubenswrapper[4744]: I0930 03:07:03.963221 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66vpm\" (UniqueName: \"kubernetes.io/projected/c71c1f25-489e-4e86-a414-8376c9aaf25f-kube-api-access-66vpm\") pod \"console-649d7b75dc-2h4tl\" (UID: \"c71c1f25-489e-4e86-a414-8376c9aaf25f\") " pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.132237 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr"] Sep 30 03:07:04 crc kubenswrapper[4744]: W0930 03:07:04.149719 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca163139_502b_44cc_ae53_83bc49866259.slice/crio-32d414294979c5998744ce0c364c7f20ffbd01310f033b44673756d29c4017af WatchSource:0}: Error finding container 32d414294979c5998744ce0c364c7f20ffbd01310f033b44673756d29c4017af: Status 404 returned error can't find the container with id 32d414294979c5998744ce0c364c7f20ffbd01310f033b44673756d29c4017af Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.174301 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.177520 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924"] Sep 30 03:07:04 crc kubenswrapper[4744]: W0930 03:07:04.185448 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc58a40af_7fd8_4a82_8109_855fbb1c32f3.slice/crio-bdb8b56330e7fd3c1da51b77e05058015f0fdfff7bebf1078fbdec6b236c26a7 WatchSource:0}: Error finding container bdb8b56330e7fd3c1da51b77e05058015f0fdfff7bebf1078fbdec6b236c26a7: Status 404 returned error can't find the container with id bdb8b56330e7fd3c1da51b77e05058015f0fdfff7bebf1078fbdec6b236c26a7 Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.264928 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-lclrw"] Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.347875 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.347914 4744 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.375008 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-649d7b75dc-2h4tl"] Sep 30 03:07:04 crc kubenswrapper[4744]: W0930 03:07:04.379562 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc71c1f25_489e_4e86_a414_8376c9aaf25f.slice/crio-235678315f7f40c6481db976ed443f93f0d4e15bea555be577050e5cf8d112b2 WatchSource:0}: Error finding container 235678315f7f40c6481db976ed443f93f0d4e15bea555be577050e5cf8d112b2: Status 404 returned error can't find the container with id 235678315f7f40c6481db976ed443f93f0d4e15bea555be577050e5cf8d112b2 Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.395246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" event={"ID":"c58a40af-7fd8-4a82-8109-855fbb1c32f3","Type":"ContainerStarted","Data":"bdb8b56330e7fd3c1da51b77e05058015f0fdfff7bebf1078fbdec6b236c26a7"} Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.396155 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-57mt4" event={"ID":"1b5d08ff-f7f3-4f08-a8be-dd45390037e4","Type":"ContainerStarted","Data":"57348ecca0ae8df626245083d65a294b5d93c1a07a97c7abe61eb0930e6a1b0a"} Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.397676 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" event={"ID":"ca163139-502b-44cc-ae53-83bc49866259","Type":"ContainerStarted","Data":"32d414294979c5998744ce0c364c7f20ffbd01310f033b44673756d29c4017af"} Sep 30 03:07:04 crc kubenswrapper[4744]: 
I0930 03:07:04.399820 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" event={"ID":"37a9886e-c2c0-46ab-a260-57231999e956","Type":"ContainerStarted","Data":"9f11997116dffb8f70a6235cb73852cb201708c8b7f10e020b6b94943e2a6045"} Sep 30 03:07:04 crc kubenswrapper[4744]: I0930 03:07:04.401275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-649d7b75dc-2h4tl" event={"ID":"c71c1f25-489e-4e86-a414-8376c9aaf25f","Type":"ContainerStarted","Data":"235678315f7f40c6481db976ed443f93f0d4e15bea555be577050e5cf8d112b2"} Sep 30 03:07:05 crc kubenswrapper[4744]: I0930 03:07:05.410642 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-649d7b75dc-2h4tl" event={"ID":"c71c1f25-489e-4e86-a414-8376c9aaf25f","Type":"ContainerStarted","Data":"f7f9fe6a0f660d67f4281659199b576ef3f699ceb77ae73ef7c5fcf7c1409f9b"} Sep 30 03:07:05 crc kubenswrapper[4744]: I0930 03:07:05.430534 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-649d7b75dc-2h4tl" podStartSLOduration=2.430515107 podStartE2EDuration="2.430515107s" podCreationTimestamp="2025-09-30 03:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:07:05.427501274 +0000 UTC m=+752.600721248" watchObservedRunningTime="2025-09-30 03:07:05.430515107 +0000 UTC m=+752.603735081" Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.427648 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" event={"ID":"c58a40af-7fd8-4a82-8109-855fbb1c32f3","Type":"ContainerStarted","Data":"4301a8f250c98e5494a4604bc47b9343b597cf4e61b52f9b4d6f28f0f3e50364"} Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.429578 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-57mt4" 
event={"ID":"1b5d08ff-f7f3-4f08-a8be-dd45390037e4","Type":"ContainerStarted","Data":"fc0c972d06eec71c585b4b25ef22a903b191226adbfa15e8225bc06efd2ec5e4"} Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.429809 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.440967 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" event={"ID":"ca163139-502b-44cc-ae53-83bc49866259","Type":"ContainerStarted","Data":"dfd023ca8d9dc329cc1dce397ba9efb5fc414117c9eaaa82972bfea98902ace9"} Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.448680 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" event={"ID":"37a9886e-c2c0-46ab-a260-57231999e956","Type":"ContainerStarted","Data":"5749b5edc3ccce16fdcf7257f81f54a7dcaa975683709b0b4e5286e44f5f54e6"} Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.450725 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.493194 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-57mt4" podStartSLOduration=1.568231596 podStartE2EDuration="4.493162535s" podCreationTimestamp="2025-09-30 03:07:03 +0000 UTC" firstStartedPulling="2025-09-30 03:07:03.91700811 +0000 UTC m=+751.090228074" lastFinishedPulling="2025-09-30 03:07:06.84193904 +0000 UTC m=+754.015159013" observedRunningTime="2025-09-30 03:07:07.486295591 +0000 UTC m=+754.659515605" watchObservedRunningTime="2025-09-30 03:07:07.493162535 +0000 UTC m=+754.666382539" Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.494466 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-kg924" 
podStartSLOduration=1.896185147 podStartE2EDuration="4.494453625s" podCreationTimestamp="2025-09-30 03:07:03 +0000 UTC" firstStartedPulling="2025-09-30 03:07:04.19892475 +0000 UTC m=+751.372144724" lastFinishedPulling="2025-09-30 03:07:06.797193218 +0000 UTC m=+753.970413202" observedRunningTime="2025-09-30 03:07:07.457686001 +0000 UTC m=+754.630906015" watchObservedRunningTime="2025-09-30 03:07:07.494453625 +0000 UTC m=+754.667673639" Sep 30 03:07:07 crc kubenswrapper[4744]: I0930 03:07:07.525901 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" podStartSLOduration=1.999539432 podStartE2EDuration="4.525863722s" podCreationTimestamp="2025-09-30 03:07:03 +0000 UTC" firstStartedPulling="2025-09-30 03:07:04.281636272 +0000 UTC m=+751.454856246" lastFinishedPulling="2025-09-30 03:07:06.807960562 +0000 UTC m=+753.981180536" observedRunningTime="2025-09-30 03:07:07.515212011 +0000 UTC m=+754.688432025" watchObservedRunningTime="2025-09-30 03:07:07.525863722 +0000 UTC m=+754.699083706" Sep 30 03:07:09 crc kubenswrapper[4744]: I0930 03:07:09.467692 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" event={"ID":"ca163139-502b-44cc-ae53-83bc49866259","Type":"ContainerStarted","Data":"9c481a2fdc1c6d1f3b244147f5be69460e81e36bee74367b9c593a07fa23c488"} Sep 30 03:07:09 crc kubenswrapper[4744]: I0930 03:07:09.492736 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tt4lr" podStartSLOduration=1.6722693130000001 podStartE2EDuration="6.49271332s" podCreationTimestamp="2025-09-30 03:07:03 +0000 UTC" firstStartedPulling="2025-09-30 03:07:04.151898447 +0000 UTC m=+751.325118411" lastFinishedPulling="2025-09-30 03:07:08.972342424 +0000 UTC m=+756.145562418" observedRunningTime="2025-09-30 03:07:09.490197532 +0000 UTC m=+756.663417556" watchObservedRunningTime="2025-09-30 
03:07:09.49271332 +0000 UTC m=+756.665933334" Sep 30 03:07:13 crc kubenswrapper[4744]: I0930 03:07:13.901198 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-57mt4" Sep 30 03:07:14 crc kubenswrapper[4744]: I0930 03:07:14.175424 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:14 crc kubenswrapper[4744]: I0930 03:07:14.176595 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:14 crc kubenswrapper[4744]: I0930 03:07:14.183715 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:14 crc kubenswrapper[4744]: I0930 03:07:14.509320 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-649d7b75dc-2h4tl" Sep 30 03:07:14 crc kubenswrapper[4744]: I0930 03:07:14.584281 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7gvsx"] Sep 30 03:07:22 crc kubenswrapper[4744]: I0930 03:07:22.883646 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tp75b"] Sep 30 03:07:22 crc kubenswrapper[4744]: I0930 03:07:22.884820 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" podUID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" containerName="controller-manager" containerID="cri-o://5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073" gracePeriod=30 Sep 30 03:07:22 crc kubenswrapper[4744]: I0930 03:07:22.983148 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd"] Sep 30 03:07:22 crc kubenswrapper[4744]: I0930 03:07:22.983524 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" podUID="e0918c73-ef87-42c8-8395-9499c5a91e2b" containerName="route-controller-manager" containerID="cri-o://458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d" gracePeriod=30 Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.392082 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.404474 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448437 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-client-ca\") pod \"e0918c73-ef87-42c8-8395-9499c5a91e2b\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448570 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-client-ca\") pod \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448622 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-proxy-ca-bundles\") pod \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448669 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4jlll\" (UniqueName: \"kubernetes.io/projected/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-kube-api-access-4jlll\") pod \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448726 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-serving-cert\") pod \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448778 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l46mg\" (UniqueName: \"kubernetes.io/projected/e0918c73-ef87-42c8-8395-9499c5a91e2b-kube-api-access-l46mg\") pod \"e0918c73-ef87-42c8-8395-9499c5a91e2b\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448838 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0918c73-ef87-42c8-8395-9499c5a91e2b-serving-cert\") pod \"e0918c73-ef87-42c8-8395-9499c5a91e2b\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448900 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-config\") pod \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\" (UID: \"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9\") " Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.448948 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-config\") pod \"e0918c73-ef87-42c8-8395-9499c5a91e2b\" (UID: \"e0918c73-ef87-42c8-8395-9499c5a91e2b\") " Sep 30 03:07:23 crc 
kubenswrapper[4744]: I0930 03:07:23.449135 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0918c73-ef87-42c8-8395-9499c5a91e2b" (UID: "e0918c73-ef87-42c8-8395-9499c5a91e2b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.449173 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" (UID: "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.449284 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-client-ca" (OuterVolumeSpecName: "client-ca") pod "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" (UID: "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.449638 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.449652 4744 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.449663 4744 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.449754 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-config" (OuterVolumeSpecName: "config") pod "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" (UID: "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.450018 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-config" (OuterVolumeSpecName: "config") pod "e0918c73-ef87-42c8-8395-9499c5a91e2b" (UID: "e0918c73-ef87-42c8-8395-9499c5a91e2b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.457864 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-kube-api-access-4jlll" (OuterVolumeSpecName: "kube-api-access-4jlll") pod "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" (UID: "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9"). InnerVolumeSpecName "kube-api-access-4jlll". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.457901 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0918c73-ef87-42c8-8395-9499c5a91e2b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0918c73-ef87-42c8-8395-9499c5a91e2b" (UID: "e0918c73-ef87-42c8-8395-9499c5a91e2b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.457909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" (UID: "8b7426d8-1f6e-4dfc-b3b4-daf337153ce9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.457949 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0918c73-ef87-42c8-8395-9499c5a91e2b-kube-api-access-l46mg" (OuterVolumeSpecName: "kube-api-access-l46mg") pod "e0918c73-ef87-42c8-8395-9499c5a91e2b" (UID: "e0918c73-ef87-42c8-8395-9499c5a91e2b"). InnerVolumeSpecName "kube-api-access-l46mg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.551744 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.551791 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l46mg\" (UniqueName: \"kubernetes.io/projected/e0918c73-ef87-42c8-8395-9499c5a91e2b-kube-api-access-l46mg\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.551813 4744 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0918c73-ef87-42c8-8395-9499c5a91e2b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.551832 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.551848 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0918c73-ef87-42c8-8395-9499c5a91e2b-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.551865 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jlll\" (UniqueName: \"kubernetes.io/projected/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9-kube-api-access-4jlll\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.568460 4744 generic.go:334] "Generic (PLEG): container finished" podID="e0918c73-ef87-42c8-8395-9499c5a91e2b" containerID="458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d" exitCode=0 Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.568494 4744 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.568578 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" event={"ID":"e0918c73-ef87-42c8-8395-9499c5a91e2b","Type":"ContainerDied","Data":"458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d"} Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.568638 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd" event={"ID":"e0918c73-ef87-42c8-8395-9499c5a91e2b","Type":"ContainerDied","Data":"ebbe0cc6b2fbdcb6d685054617d0536dbb2338e24d1974116ea7b4bbbbac40a2"} Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.568659 4744 scope.go:117] "RemoveContainer" containerID="458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.572031 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.572143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" event={"ID":"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9","Type":"ContainerDied","Data":"5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073"} Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.572259 4744 generic.go:334] "Generic (PLEG): container finished" podID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" containerID="5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073" exitCode=0 Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.572305 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tp75b" event={"ID":"8b7426d8-1f6e-4dfc-b3b4-daf337153ce9","Type":"ContainerDied","Data":"87c6a8d567736a15119d5cf41bf7cb70a2c9bb8185834ebc2b3773bf9f4b9339"} Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.606327 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.608961 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pjvjd"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.611912 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tp75b"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.614747 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tp75b"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.698383 4744 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl"] Sep 30 03:07:23 crc kubenswrapper[4744]: E0930 03:07:23.698597 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0918c73-ef87-42c8-8395-9499c5a91e2b" containerName="route-controller-manager" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.698608 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0918c73-ef87-42c8-8395-9499c5a91e2b" containerName="route-controller-manager" Sep 30 03:07:23 crc kubenswrapper[4744]: E0930 03:07:23.698622 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" containerName="controller-manager" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.698628 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" containerName="controller-manager" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.698724 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0918c73-ef87-42c8-8395-9499c5a91e2b" containerName="route-controller-manager" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.698737 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" containerName="controller-manager" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.699085 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.706475 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.706690 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.706808 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.709672 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.709805 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.716059 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.719458 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.728776 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58b65f8cf-bxrwn"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.729684 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.742736 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.743084 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.743815 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.743863 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.744146 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.744226 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.747325 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.752113 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b65f8cf-bxrwn"] Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.753307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q8bz\" (UniqueName: \"kubernetes.io/projected/8bf011fe-7512-403a-a247-91e0e01ecaf9-kube-api-access-9q8bz\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " 
pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.753386 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bf011fe-7512-403a-a247-91e0e01ecaf9-serving-cert\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.753422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf011fe-7512-403a-a247-91e0e01ecaf9-config\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.753458 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bf011fe-7512-403a-a247-91e0e01ecaf9-client-ca\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.820711 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-lclrw" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.840591 4744 scope.go:117] "RemoveContainer" containerID="458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d" Sep 30 03:07:23 crc kubenswrapper[4744]: E0930 03:07:23.840858 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d\": container with ID starting with 458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d not found: ID does not exist" containerID="458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.840888 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d"} err="failed to get container status \"458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d\": rpc error: code = NotFound desc = could not find container \"458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d\": container with ID starting with 458fa217155f60665c9cc7604c9275adf48eadd695e3aa4ce0bf0b3fd608e24d not found: ID does not exist" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.840908 4744 scope.go:117] "RemoveContainer" containerID="5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854497 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bf011fe-7512-403a-a247-91e0e01ecaf9-serving-cert\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854752 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-proxy-ca-bundles\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854860 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf011fe-7512-403a-a247-91e0e01ecaf9-config\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854896 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-config\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854946 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5nm\" (UniqueName: \"kubernetes.io/projected/13cf6306-d7c7-4bf3-b076-36d79226109d-kube-api-access-xx5nm\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bf011fe-7512-403a-a247-91e0e01ecaf9-client-ca\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.854992 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-client-ca\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: 
\"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.855035 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13cf6306-d7c7-4bf3-b076-36d79226109d-serving-cert\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.855051 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q8bz\" (UniqueName: \"kubernetes.io/projected/8bf011fe-7512-403a-a247-91e0e01ecaf9-kube-api-access-9q8bz\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.855173 4744 scope.go:117] "RemoveContainer" containerID="5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.855785 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bf011fe-7512-403a-a247-91e0e01ecaf9-client-ca\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: E0930 03:07:23.855879 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073\": container with ID starting with 5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073 not found: ID does not exist" 
containerID="5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.855904 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073"} err="failed to get container status \"5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073\": rpc error: code = NotFound desc = could not find container \"5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073\": container with ID starting with 5e5fe7ae63bb08d000a8e656215e2eb7289a61e8d12478169d9b4756ecfa6073 not found: ID does not exist" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.856182 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf011fe-7512-403a-a247-91e0e01ecaf9-config\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.859575 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bf011fe-7512-403a-a247-91e0e01ecaf9-serving-cert\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.881140 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q8bz\" (UniqueName: \"kubernetes.io/projected/8bf011fe-7512-403a-a247-91e0e01ecaf9-kube-api-access-9q8bz\") pod \"route-controller-manager-79bf699b9c-qbfpl\" (UID: \"8bf011fe-7512-403a-a247-91e0e01ecaf9\") " pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:23 crc kubenswrapper[4744]: 
I0930 03:07:23.956883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13cf6306-d7c7-4bf3-b076-36d79226109d-serving-cert\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.957012 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-proxy-ca-bundles\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.957057 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-config\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.957109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5nm\" (UniqueName: \"kubernetes.io/projected/13cf6306-d7c7-4bf3-b076-36d79226109d-kube-api-access-xx5nm\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.957150 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-client-ca\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " 
pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.958458 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-client-ca\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.959486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-config\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.960698 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13cf6306-d7c7-4bf3-b076-36d79226109d-proxy-ca-bundles\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.962905 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13cf6306-d7c7-4bf3-b076-36d79226109d-serving-cert\") pod \"controller-manager-58b65f8cf-bxrwn\" (UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:23 crc kubenswrapper[4744]: I0930 03:07:23.980822 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5nm\" (UniqueName: \"kubernetes.io/projected/13cf6306-d7c7-4bf3-b076-36d79226109d-kube-api-access-xx5nm\") pod \"controller-manager-58b65f8cf-bxrwn\" 
(UID: \"13cf6306-d7c7-4bf3-b076-36d79226109d\") " pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:24 crc kubenswrapper[4744]: I0930 03:07:24.014670 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:24 crc kubenswrapper[4744]: I0930 03:07:24.053559 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:24 crc kubenswrapper[4744]: I0930 03:07:24.246410 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl"] Sep 30 03:07:24 crc kubenswrapper[4744]: W0930 03:07:24.255167 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bf011fe_7512_403a_a247_91e0e01ecaf9.slice/crio-3e4c1680199e4ef41f1a02f7726007966cd99fb8a6e0bacfb2b756d929d8f4f9 WatchSource:0}: Error finding container 3e4c1680199e4ef41f1a02f7726007966cd99fb8a6e0bacfb2b756d929d8f4f9: Status 404 returned error can't find the container with id 3e4c1680199e4ef41f1a02f7726007966cd99fb8a6e0bacfb2b756d929d8f4f9 Sep 30 03:07:24 crc kubenswrapper[4744]: I0930 03:07:24.325491 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b65f8cf-bxrwn"] Sep 30 03:07:24 crc kubenswrapper[4744]: W0930 03:07:24.327029 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13cf6306_d7c7_4bf3_b076_36d79226109d.slice/crio-b97f20f3a098efbcb4e937352ae7fd96f39448354d6fab749740970d71994b9d WatchSource:0}: Error finding container b97f20f3a098efbcb4e937352ae7fd96f39448354d6fab749740970d71994b9d: Status 404 returned error can't find the container with id 
b97f20f3a098efbcb4e937352ae7fd96f39448354d6fab749740970d71994b9d Sep 30 03:07:24 crc kubenswrapper[4744]: I0930 03:07:24.583792 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" event={"ID":"13cf6306-d7c7-4bf3-b076-36d79226109d","Type":"ContainerStarted","Data":"b97f20f3a098efbcb4e937352ae7fd96f39448354d6fab749740970d71994b9d"} Sep 30 03:07:24 crc kubenswrapper[4744]: I0930 03:07:24.585933 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" event={"ID":"8bf011fe-7512-403a-a247-91e0e01ecaf9","Type":"ContainerStarted","Data":"3e4c1680199e4ef41f1a02f7726007966cd99fb8a6e0bacfb2b756d929d8f4f9"} Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.511434 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7426d8-1f6e-4dfc-b3b4-daf337153ce9" path="/var/lib/kubelet/pods/8b7426d8-1f6e-4dfc-b3b4-daf337153ce9/volumes" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.512576 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0918c73-ef87-42c8-8395-9499c5a91e2b" path="/var/lib/kubelet/pods/e0918c73-ef87-42c8-8395-9499c5a91e2b/volumes" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.595779 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" event={"ID":"8bf011fe-7512-403a-a247-91e0e01ecaf9","Type":"ContainerStarted","Data":"9beb85b0837ffc6e53136220bdaf96287b7a0893453bb4b46ce04faf53546f05"} Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.596002 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.597550 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" event={"ID":"13cf6306-d7c7-4bf3-b076-36d79226109d","Type":"ContainerStarted","Data":"b3a84bc71f1a478248ec25d9054478c1fba19e0fbdab8d7c30b109d86b360474"} Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.597926 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.602701 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.604966 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.614472 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79bf699b9c-qbfpl" podStartSLOduration=2.614458301 podStartE2EDuration="2.614458301s" podCreationTimestamp="2025-09-30 03:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:07:25.611392165 +0000 UTC m=+772.784612139" watchObservedRunningTime="2025-09-30 03:07:25.614458301 +0000 UTC m=+772.787678275" Sep 30 03:07:25 crc kubenswrapper[4744]: I0930 03:07:25.661434 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58b65f8cf-bxrwn" podStartSLOduration=2.6614187510000002 podStartE2EDuration="2.661418751s" podCreationTimestamp="2025-09-30 03:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:07:25.659813191 +0000 UTC m=+772.833033165" 
watchObservedRunningTime="2025-09-30 03:07:25.661418751 +0000 UTC m=+772.834638725" Sep 30 03:07:30 crc kubenswrapper[4744]: I0930 03:07:30.892303 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cd96r"] Sep 30 03:07:30 crc kubenswrapper[4744]: I0930 03:07:30.893932 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:30 crc kubenswrapper[4744]: I0930 03:07:30.908855 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cd96r"] Sep 30 03:07:30 crc kubenswrapper[4744]: I0930 03:07:30.962430 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhht\" (UniqueName: \"kubernetes.io/projected/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-kube-api-access-vxhht\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:30 crc kubenswrapper[4744]: I0930 03:07:30.962491 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-utilities\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:30 crc kubenswrapper[4744]: I0930 03:07:30.962689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-catalog-content\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.063943 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-catalog-content\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.064028 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhht\" (UniqueName: \"kubernetes.io/projected/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-kube-api-access-vxhht\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.064071 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-utilities\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.064745 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-utilities\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.064880 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-catalog-content\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.092773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhht\" (UniqueName: 
\"kubernetes.io/projected/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-kube-api-access-vxhht\") pod \"redhat-marketplace-cd96r\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.255544 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.398219 4744 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 03:07:31 crc kubenswrapper[4744]: I0930 03:07:31.728584 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cd96r"] Sep 30 03:07:32 crc kubenswrapper[4744]: I0930 03:07:32.649104 4744 generic.go:334] "Generic (PLEG): container finished" podID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerID="a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9" exitCode=0 Sep 30 03:07:32 crc kubenswrapper[4744]: I0930 03:07:32.649253 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cd96r" event={"ID":"2fe27dc6-e14d-4942-b6e5-6f09d06dd051","Type":"ContainerDied","Data":"a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9"} Sep 30 03:07:32 crc kubenswrapper[4744]: I0930 03:07:32.649524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cd96r" event={"ID":"2fe27dc6-e14d-4942-b6e5-6f09d06dd051","Type":"ContainerStarted","Data":"fb88a0054df54d35621a5c65804d1ccfa080ca5b73f08e6e5fbf60aef8e6dc3c"} Sep 30 03:07:33 crc kubenswrapper[4744]: E0930 03:07:33.984279 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe27dc6_e14d_4942_b6e5_6f09d06dd051.slice/crio-460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe27dc6_e14d_4942_b6e5_6f09d06dd051.slice/crio-conmon-460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb.scope\": RecentStats: unable to find data in memory cache]" Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.348535 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.348607 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.348669 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.349751 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30f20d65f55e83fb7df6fb2f203d982a107f210e9c52e670591915139c564a0e"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.349880 4744 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://30f20d65f55e83fb7df6fb2f203d982a107f210e9c52e670591915139c564a0e" gracePeriod=600 Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.671278 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="30f20d65f55e83fb7df6fb2f203d982a107f210e9c52e670591915139c564a0e" exitCode=0 Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.671350 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"30f20d65f55e83fb7df6fb2f203d982a107f210e9c52e670591915139c564a0e"} Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.671737 4744 scope.go:117] "RemoveContainer" containerID="1b74a2cf6f2555d1fa0644ad61981edd38de911946c34219f0d3be9f495b0022" Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.674883 4744 generic.go:334] "Generic (PLEG): container finished" podID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerID="460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb" exitCode=0 Sep 30 03:07:34 crc kubenswrapper[4744]: I0930 03:07:34.674919 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cd96r" event={"ID":"2fe27dc6-e14d-4942-b6e5-6f09d06dd051","Type":"ContainerDied","Data":"460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb"} Sep 30 03:07:35 crc kubenswrapper[4744]: I0930 03:07:35.687142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"8a5c6bc379bf988ae0369b42f93fd361d89694e20343a5b27933e4ef1594e651"} Sep 30 03:07:35 crc kubenswrapper[4744]: I0930 
03:07:35.694906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cd96r" event={"ID":"2fe27dc6-e14d-4942-b6e5-6f09d06dd051","Type":"ContainerStarted","Data":"5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f"} Sep 30 03:07:35 crc kubenswrapper[4744]: I0930 03:07:35.729968 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cd96r" podStartSLOduration=3.185870376 podStartE2EDuration="5.729803265s" podCreationTimestamp="2025-09-30 03:07:30 +0000 UTC" firstStartedPulling="2025-09-30 03:07:32.652317084 +0000 UTC m=+779.825537058" lastFinishedPulling="2025-09-30 03:07:35.196249973 +0000 UTC m=+782.369469947" observedRunningTime="2025-09-30 03:07:35.729770504 +0000 UTC m=+782.902990488" watchObservedRunningTime="2025-09-30 03:07:35.729803265 +0000 UTC m=+782.903023279" Sep 30 03:07:39 crc kubenswrapper[4744]: I0930 03:07:39.649820 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7gvsx" podUID="dd70937c-9e84-468b-b81f-b9f400436aec" containerName="console" containerID="cri-o://ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531" gracePeriod=15 Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.280902 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7gvsx_dd70937c-9e84-468b-b81f-b9f400436aec/console/0.log" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.281444 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.398366 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-serving-cert\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.399963 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-trusted-ca-bundle\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.400146 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pf7\" (UniqueName: \"kubernetes.io/projected/dd70937c-9e84-468b-b81f-b9f400436aec-kube-api-access-67pf7\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.400204 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-oauth-config\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.400255 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-oauth-serving-cert\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.400327 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-service-ca\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.400408 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-console-config\") pod \"dd70937c-9e84-468b-b81f-b9f400436aec\" (UID: \"dd70937c-9e84-468b-b81f-b9f400436aec\") " Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.401060 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.401264 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.401277 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.401528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-console-config" (OuterVolumeSpecName: "console-config") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.406974 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.407549 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.408689 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd70937c-9e84-468b-b81f-b9f400436aec-kube-api-access-67pf7" (OuterVolumeSpecName: "kube-api-access-67pf7") pod "dd70937c-9e84-468b-b81f-b9f400436aec" (UID: "dd70937c-9e84-468b-b81f-b9f400436aec"). InnerVolumeSpecName "kube-api-access-67pf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502498 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pf7\" (UniqueName: \"kubernetes.io/projected/dd70937c-9e84-468b-b81f-b9f400436aec-kube-api-access-67pf7\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502550 4744 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502571 4744 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502598 4744 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502616 4744 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502634 4744 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd70937c-9e84-468b-b81f-b9f400436aec-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.502651 4744 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd70937c-9e84-468b-b81f-b9f400436aec-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:40 crc 
kubenswrapper[4744]: I0930 03:07:40.741111 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7gvsx_dd70937c-9e84-468b-b81f-b9f400436aec/console/0.log" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.741162 4744 generic.go:334] "Generic (PLEG): container finished" podID="dd70937c-9e84-468b-b81f-b9f400436aec" containerID="ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531" exitCode=2 Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.741255 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7gvsx" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.742628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7gvsx" event={"ID":"dd70937c-9e84-468b-b81f-b9f400436aec","Type":"ContainerDied","Data":"ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531"} Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.743002 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7gvsx" event={"ID":"dd70937c-9e84-468b-b81f-b9f400436aec","Type":"ContainerDied","Data":"ddf9dc7506d8440af88006361f8b25d192e53101d18c7ca62fa4466be6202c9c"} Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.743054 4744 scope.go:117] "RemoveContainer" containerID="ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.770757 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7gvsx"] Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.777119 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7gvsx"] Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.777603 4744 scope.go:117] "RemoveContainer" containerID="ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531" Sep 30 03:07:40 crc 
kubenswrapper[4744]: E0930 03:07:40.778190 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531\": container with ID starting with ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531 not found: ID does not exist" containerID="ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531" Sep 30 03:07:40 crc kubenswrapper[4744]: I0930 03:07:40.778245 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531"} err="failed to get container status \"ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531\": rpc error: code = NotFound desc = could not find container \"ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531\": container with ID starting with ad66e1367868b6d126e05332513c676c63be126f94795c8e4578a726deca9531 not found: ID does not exist" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.256662 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.256711 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.326666 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.379486 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4"] Sep 30 03:07:41 crc kubenswrapper[4744]: E0930 03:07:41.379746 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd70937c-9e84-468b-b81f-b9f400436aec" 
containerName="console" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.379760 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd70937c-9e84-468b-b81f-b9f400436aec" containerName="console" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.379915 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd70937c-9e84-468b-b81f-b9f400436aec" containerName="console" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.380871 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.384867 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.388779 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4"] Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.415573 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.415622 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc 
kubenswrapper[4744]: I0930 03:07:41.415653 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276jt\" (UniqueName: \"kubernetes.io/projected/9f265019-d7fa-4768-95f1-aeefab156c9c-kube-api-access-276jt\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.509623 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd70937c-9e84-468b-b81f-b9f400436aec" path="/var/lib/kubelet/pods/dd70937c-9e84-468b-b81f-b9f400436aec/volumes" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.516911 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.516949 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.516978 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276jt\" (UniqueName: \"kubernetes.io/projected/9f265019-d7fa-4768-95f1-aeefab156c9c-kube-api-access-276jt\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: 
\"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.517585 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.517654 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.537877 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276jt\" (UniqueName: \"kubernetes.io/projected/9f265019-d7fa-4768-95f1-aeefab156c9c-kube-api-access-276jt\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.709766 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:41 crc kubenswrapper[4744]: I0930 03:07:41.816771 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:42 crc kubenswrapper[4744]: I0930 03:07:42.179177 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4"] Sep 30 03:07:42 crc kubenswrapper[4744]: I0930 03:07:42.759907 4744 generic.go:334] "Generic (PLEG): container finished" podID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerID="d7b7b83fa63b9e2cf22b597efd35b8cdc5a2aad6b53cfb0dd74f6708741ed5ec" exitCode=0 Sep 30 03:07:42 crc kubenswrapper[4744]: I0930 03:07:42.759972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" event={"ID":"9f265019-d7fa-4768-95f1-aeefab156c9c","Type":"ContainerDied","Data":"d7b7b83fa63b9e2cf22b597efd35b8cdc5a2aad6b53cfb0dd74f6708741ed5ec"} Sep 30 03:07:42 crc kubenswrapper[4744]: I0930 03:07:42.760469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" event={"ID":"9f265019-d7fa-4768-95f1-aeefab156c9c","Type":"ContainerStarted","Data":"1bdb2e3d4d10d096bda4e3006089b0c40c6b43fd5a5e35eceb458c8bcec56207"} Sep 30 03:07:43 crc kubenswrapper[4744]: I0930 03:07:43.905756 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4lzch"] Sep 30 03:07:43 crc kubenswrapper[4744]: I0930 03:07:43.910058 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:43 crc kubenswrapper[4744]: I0930 03:07:43.925858 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4lzch"] Sep 30 03:07:43 crc kubenswrapper[4744]: I0930 03:07:43.947980 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-utilities\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:43 crc kubenswrapper[4744]: I0930 03:07:43.948051 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-kube-api-access-hs78b\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:43 crc kubenswrapper[4744]: I0930 03:07:43.948115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-catalog-content\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.049029 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-catalog-content\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.049158 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-utilities\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.049184 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-kube-api-access-hs78b\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.050791 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-utilities\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.050812 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-catalog-content\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.070686 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-kube-api-access-hs78b\") pod \"redhat-operators-4lzch\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.235091 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.632238 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4lzch"] Sep 30 03:07:44 crc kubenswrapper[4744]: W0930 03:07:44.645272 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e3af16_b06e_4c80_bae1_9cd7d18cdbb1.slice/crio-fdd04fc05ec331038a9d89d99c7a130f7fab897066c9510d4606426a1521002e WatchSource:0}: Error finding container fdd04fc05ec331038a9d89d99c7a130f7fab897066c9510d4606426a1521002e: Status 404 returned error can't find the container with id fdd04fc05ec331038a9d89d99c7a130f7fab897066c9510d4606426a1521002e Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.778539 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerStarted","Data":"fdd04fc05ec331038a9d89d99c7a130f7fab897066c9510d4606426a1521002e"} Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.780937 4744 generic.go:334] "Generic (PLEG): container finished" podID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerID="ee0a83c2fe22658122eb5831ab8c5602f5513ce276870dac66fc341e6ad556c7" exitCode=0 Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.780971 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" event={"ID":"9f265019-d7fa-4768-95f1-aeefab156c9c","Type":"ContainerDied","Data":"ee0a83c2fe22658122eb5831ab8c5602f5513ce276870dac66fc341e6ad556c7"} Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.880669 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cd96r"] Sep 30 03:07:44 crc kubenswrapper[4744]: I0930 03:07:44.880994 4744 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-cd96r" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="registry-server" containerID="cri-o://5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f" gracePeriod=2 Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.354077 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.464312 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-utilities\") pod \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.464460 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-catalog-content\") pod \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.464526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhht\" (UniqueName: \"kubernetes.io/projected/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-kube-api-access-vxhht\") pod \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\" (UID: \"2fe27dc6-e14d-4942-b6e5-6f09d06dd051\") " Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.466023 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-utilities" (OuterVolumeSpecName: "utilities") pod "2fe27dc6-e14d-4942-b6e5-6f09d06dd051" (UID: "2fe27dc6-e14d-4942-b6e5-6f09d06dd051"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.473470 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-kube-api-access-vxhht" (OuterVolumeSpecName: "kube-api-access-vxhht") pod "2fe27dc6-e14d-4942-b6e5-6f09d06dd051" (UID: "2fe27dc6-e14d-4942-b6e5-6f09d06dd051"). InnerVolumeSpecName "kube-api-access-vxhht". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.493133 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fe27dc6-e14d-4942-b6e5-6f09d06dd051" (UID: "2fe27dc6-e14d-4942-b6e5-6f09d06dd051"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.567081 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.567132 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.567151 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhht\" (UniqueName: \"kubernetes.io/projected/2fe27dc6-e14d-4942-b6e5-6f09d06dd051-kube-api-access-vxhht\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.790081 4744 generic.go:334] "Generic (PLEG): container finished" podID="9f265019-d7fa-4768-95f1-aeefab156c9c" 
containerID="c4706c626e21d96f023afb8b65ce3c764fcb14db2443a6abeb2c0fd9729b285c" exitCode=0 Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.790253 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" event={"ID":"9f265019-d7fa-4768-95f1-aeefab156c9c","Type":"ContainerDied","Data":"c4706c626e21d96f023afb8b65ce3c764fcb14db2443a6abeb2c0fd9729b285c"} Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.794040 4744 generic.go:334] "Generic (PLEG): container finished" podID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerID="5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f" exitCode=0 Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.794124 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cd96r" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.794110 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cd96r" event={"ID":"2fe27dc6-e14d-4942-b6e5-6f09d06dd051","Type":"ContainerDied","Data":"5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f"} Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.794275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cd96r" event={"ID":"2fe27dc6-e14d-4942-b6e5-6f09d06dd051","Type":"ContainerDied","Data":"fb88a0054df54d35621a5c65804d1ccfa080ca5b73f08e6e5fbf60aef8e6dc3c"} Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.794429 4744 scope.go:117] "RemoveContainer" containerID="5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.797361 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" 
event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerDied","Data":"4e5d0d5fef3e4f1fcf0de3bb0e095d24ff04ea33b9dabfbeb79349159a0d0565"} Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.797166 4744 generic.go:334] "Generic (PLEG): container finished" podID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerID="4e5d0d5fef3e4f1fcf0de3bb0e095d24ff04ea33b9dabfbeb79349159a0d0565" exitCode=0 Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.834678 4744 scope.go:117] "RemoveContainer" containerID="460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.841512 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cd96r"] Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.845539 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cd96r"] Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.861517 4744 scope.go:117] "RemoveContainer" containerID="a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.898808 4744 scope.go:117] "RemoveContainer" containerID="5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f" Sep 30 03:07:45 crc kubenswrapper[4744]: E0930 03:07:45.899518 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f\": container with ID starting with 5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f not found: ID does not exist" containerID="5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.899607 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f"} err="failed to 
get container status \"5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f\": rpc error: code = NotFound desc = could not find container \"5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f\": container with ID starting with 5cdce3c6c5fba2f7bdfd3693788ac929b8d719acdae99033340ad0428ca3376f not found: ID does not exist" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.899644 4744 scope.go:117] "RemoveContainer" containerID="460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb" Sep 30 03:07:45 crc kubenswrapper[4744]: E0930 03:07:45.900231 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb\": container with ID starting with 460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb not found: ID does not exist" containerID="460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.900295 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb"} err="failed to get container status \"460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb\": rpc error: code = NotFound desc = could not find container \"460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb\": container with ID starting with 460b3ee3fb73abcd69f1670ceb2ea3a1c1359fcad08325f55bfe20539d822ceb not found: ID does not exist" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.900339 4744 scope.go:117] "RemoveContainer" containerID="a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9" Sep 30 03:07:45 crc kubenswrapper[4744]: E0930 03:07:45.901015 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9\": container with ID starting with a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9 not found: ID does not exist" containerID="a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9" Sep 30 03:07:45 crc kubenswrapper[4744]: I0930 03:07:45.901060 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9"} err="failed to get container status \"a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9\": rpc error: code = NotFound desc = could not find container \"a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9\": container with ID starting with a81f01eaa90d433802c8613038715259501d4b71541624dbe5304f5f449934d9 not found: ID does not exist" Sep 30 03:07:46 crc kubenswrapper[4744]: I0930 03:07:46.809449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerStarted","Data":"b2097a6dff1368e4988b9c3fd880635dbe95bd3470bc2ec7fe7ad537a6802c7c"} Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.226628 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.292382 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-util\") pod \"9f265019-d7fa-4768-95f1-aeefab156c9c\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.292419 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-bundle\") pod \"9f265019-d7fa-4768-95f1-aeefab156c9c\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.292483 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276jt\" (UniqueName: \"kubernetes.io/projected/9f265019-d7fa-4768-95f1-aeefab156c9c-kube-api-access-276jt\") pod \"9f265019-d7fa-4768-95f1-aeefab156c9c\" (UID: \"9f265019-d7fa-4768-95f1-aeefab156c9c\") " Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.294746 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-bundle" (OuterVolumeSpecName: "bundle") pod "9f265019-d7fa-4768-95f1-aeefab156c9c" (UID: "9f265019-d7fa-4768-95f1-aeefab156c9c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.297633 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f265019-d7fa-4768-95f1-aeefab156c9c-kube-api-access-276jt" (OuterVolumeSpecName: "kube-api-access-276jt") pod "9f265019-d7fa-4768-95f1-aeefab156c9c" (UID: "9f265019-d7fa-4768-95f1-aeefab156c9c"). InnerVolumeSpecName "kube-api-access-276jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.309345 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-util" (OuterVolumeSpecName: "util") pod "9f265019-d7fa-4768-95f1-aeefab156c9c" (UID: "9f265019-d7fa-4768-95f1-aeefab156c9c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.394387 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276jt\" (UniqueName: \"kubernetes.io/projected/9f265019-d7fa-4768-95f1-aeefab156c9c-kube-api-access-276jt\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.394416 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-util\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.394428 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f265019-d7fa-4768-95f1-aeefab156c9c-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.519736 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" path="/var/lib/kubelet/pods/2fe27dc6-e14d-4942-b6e5-6f09d06dd051/volumes" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.822892 4744 generic.go:334] "Generic (PLEG): container finished" podID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerID="b2097a6dff1368e4988b9c3fd880635dbe95bd3470bc2ec7fe7ad537a6802c7c" exitCode=0 Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.823089 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" 
event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerDied","Data":"b2097a6dff1368e4988b9c3fd880635dbe95bd3470bc2ec7fe7ad537a6802c7c"} Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.830177 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" event={"ID":"9f265019-d7fa-4768-95f1-aeefab156c9c","Type":"ContainerDied","Data":"1bdb2e3d4d10d096bda4e3006089b0c40c6b43fd5a5e35eceb458c8bcec56207"} Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.830246 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdb2e3d4d10d096bda4e3006089b0c40c6b43fd5a5e35eceb458c8bcec56207" Sep 30 03:07:47 crc kubenswrapper[4744]: I0930 03:07:47.830266 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4" Sep 30 03:07:48 crc kubenswrapper[4744]: I0930 03:07:48.840064 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerStarted","Data":"01e1cfa52bcbdeedd1f0612c7d5ff98e95cafb6b0689bc34899a7628beb623b8"} Sep 30 03:07:48 crc kubenswrapper[4744]: I0930 03:07:48.868777 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4lzch" podStartSLOduration=3.167474221 podStartE2EDuration="5.86874923s" podCreationTimestamp="2025-09-30 03:07:43 +0000 UTC" firstStartedPulling="2025-09-30 03:07:45.79961794 +0000 UTC m=+792.972837954" lastFinishedPulling="2025-09-30 03:07:48.500892959 +0000 UTC m=+795.674112963" observedRunningTime="2025-09-30 03:07:48.865169739 +0000 UTC m=+796.038389763" watchObservedRunningTime="2025-09-30 03:07:48.86874923 +0000 UTC m=+796.041969244" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.494736 4744 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8pgrk"] Sep 30 03:07:50 crc kubenswrapper[4744]: E0930 03:07:50.495060 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="extract" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495081 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="extract" Sep 30 03:07:50 crc kubenswrapper[4744]: E0930 03:07:50.495106 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="registry-server" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495119 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="registry-server" Sep 30 03:07:50 crc kubenswrapper[4744]: E0930 03:07:50.495140 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="pull" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495154 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="pull" Sep 30 03:07:50 crc kubenswrapper[4744]: E0930 03:07:50.495172 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="util" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495185 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="util" Sep 30 03:07:50 crc kubenswrapper[4744]: E0930 03:07:50.495200 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="extract-utilities" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495213 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" 
containerName="extract-utilities" Sep 30 03:07:50 crc kubenswrapper[4744]: E0930 03:07:50.495239 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="extract-content" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495251 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="extract-content" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495468 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe27dc6-e14d-4942-b6e5-6f09d06dd051" containerName="registry-server" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.495492 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f265019-d7fa-4768-95f1-aeefab156c9c" containerName="extract" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.497011 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.516985 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pgrk"] Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.643720 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-utilities\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.643815 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-catalog-content\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " 
pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.643962 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw757\" (UniqueName: \"kubernetes.io/projected/c1fe0a74-7657-4f93-b46c-ed598fb9c295-kube-api-access-pw757\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.745812 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-utilities\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.745909 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-catalog-content\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.745968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw757\" (UniqueName: \"kubernetes.io/projected/c1fe0a74-7657-4f93-b46c-ed598fb9c295-kube-api-access-pw757\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.746438 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-utilities\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " 
pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.746658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-catalog-content\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.777888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw757\" (UniqueName: \"kubernetes.io/projected/c1fe0a74-7657-4f93-b46c-ed598fb9c295-kube-api-access-pw757\") pod \"certified-operators-8pgrk\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:50 crc kubenswrapper[4744]: I0930 03:07:50.823218 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:07:51 crc kubenswrapper[4744]: I0930 03:07:51.296213 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pgrk"] Sep 30 03:07:51 crc kubenswrapper[4744]: I0930 03:07:51.857906 4744 generic.go:334] "Generic (PLEG): container finished" podID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerID="9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8" exitCode=0 Sep 30 03:07:51 crc kubenswrapper[4744]: I0930 03:07:51.857946 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgrk" event={"ID":"c1fe0a74-7657-4f93-b46c-ed598fb9c295","Type":"ContainerDied","Data":"9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8"} Sep 30 03:07:51 crc kubenswrapper[4744]: I0930 03:07:51.857970 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgrk" 
event={"ID":"c1fe0a74-7657-4f93-b46c-ed598fb9c295","Type":"ContainerStarted","Data":"7ce8c04d18dbf7df2f749800d3a2b6b75e37e03aa13255bcb790eb381534be7f"} Sep 30 03:07:53 crc kubenswrapper[4744]: I0930 03:07:53.870449 4744 generic.go:334] "Generic (PLEG): container finished" podID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerID="ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2" exitCode=0 Sep 30 03:07:53 crc kubenswrapper[4744]: I0930 03:07:53.870539 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgrk" event={"ID":"c1fe0a74-7657-4f93-b46c-ed598fb9c295","Type":"ContainerDied","Data":"ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2"} Sep 30 03:07:54 crc kubenswrapper[4744]: I0930 03:07:54.235448 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:54 crc kubenswrapper[4744]: I0930 03:07:54.235784 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:54 crc kubenswrapper[4744]: I0930 03:07:54.304832 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:54 crc kubenswrapper[4744]: I0930 03:07:54.878740 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgrk" event={"ID":"c1fe0a74-7657-4f93-b46c-ed598fb9c295","Type":"ContainerStarted","Data":"aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364"} Sep 30 03:07:54 crc kubenswrapper[4744]: I0930 03:07:54.900055 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8pgrk" podStartSLOduration=2.4040973389999998 podStartE2EDuration="4.900030087s" podCreationTimestamp="2025-09-30 03:07:50 +0000 UTC" firstStartedPulling="2025-09-30 03:07:51.872359885 +0000 UTC 
m=+799.045579849" lastFinishedPulling="2025-09-30 03:07:54.368292613 +0000 UTC m=+801.541512597" observedRunningTime="2025-09-30 03:07:54.894872577 +0000 UTC m=+802.068092561" watchObservedRunningTime="2025-09-30 03:07:54.900030087 +0000 UTC m=+802.073250081" Sep 30 03:07:54 crc kubenswrapper[4744]: I0930 03:07:54.954459 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:56 crc kubenswrapper[4744]: I0930 03:07:56.881164 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4lzch"] Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.118531 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc"] Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.119886 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.125781 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.125863 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4jmnh" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.125865 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.125982 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.125991 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.139900 4744 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc"] Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.140105 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkvgw\" (UniqueName: \"kubernetes.io/projected/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-kube-api-access-tkvgw\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.140241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-apiservice-cert\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.140287 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-webhook-cert\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.241522 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkvgw\" (UniqueName: \"kubernetes.io/projected/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-kube-api-access-tkvgw\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc 
kubenswrapper[4744]: I0930 03:07:57.241594 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-apiservice-cert\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.241635 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-webhook-cert\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.247169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-apiservice-cert\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.247227 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-webhook-cert\") pod \"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.259275 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkvgw\" (UniqueName: \"kubernetes.io/projected/3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e-kube-api-access-tkvgw\") pod 
\"metallb-operator-controller-manager-58d4cc4478-pt5sc\" (UID: \"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e\") " pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.338900 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t"] Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.339711 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.342124 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.342662 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc7tg\" (UniqueName: \"kubernetes.io/projected/cc149279-3823-4588-a559-a348efdb9bcd-kube-api-access-dc7tg\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.342746 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc149279-3823-4588-a559-a348efdb9bcd-webhook-cert\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.342943 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc149279-3823-4588-a559-a348efdb9bcd-apiservice-cert\") pod 
\"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.345396 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.345846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t"] Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.347934 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jtnmm" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.441020 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.451637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc149279-3823-4588-a559-a348efdb9bcd-apiservice-cert\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.451694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7tg\" (UniqueName: \"kubernetes.io/projected/cc149279-3823-4588-a559-a348efdb9bcd-kube-api-access-dc7tg\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.451740 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cc149279-3823-4588-a559-a348efdb9bcd-webhook-cert\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.471020 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc149279-3823-4588-a559-a348efdb9bcd-webhook-cert\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.493836 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc149279-3823-4588-a559-a348efdb9bcd-apiservice-cert\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.498006 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7tg\" (UniqueName: \"kubernetes.io/projected/cc149279-3823-4588-a559-a348efdb9bcd-kube-api-access-dc7tg\") pod \"metallb-operator-webhook-server-67f6c5dc78-c7h6t\" (UID: \"cc149279-3823-4588-a559-a348efdb9bcd\") " pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.653399 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.893755 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4lzch" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="registry-server" containerID="cri-o://01e1cfa52bcbdeedd1f0612c7d5ff98e95cafb6b0689bc34899a7628beb623b8" gracePeriod=2 Sep 30 03:07:57 crc kubenswrapper[4744]: I0930 03:07:57.941491 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc"] Sep 30 03:07:57 crc kubenswrapper[4744]: W0930 03:07:57.946776 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d8f001e_888e_4f58_b3c6_6b8b0fddaf3e.slice/crio-284d9e76c746dbb66cc618beca788008381ef47689473b7e95aa3b6610579337 WatchSource:0}: Error finding container 284d9e76c746dbb66cc618beca788008381ef47689473b7e95aa3b6610579337: Status 404 returned error can't find the container with id 284d9e76c746dbb66cc618beca788008381ef47689473b7e95aa3b6610579337 Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.062779 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t"] Sep 30 03:07:58 crc kubenswrapper[4744]: W0930 03:07:58.067353 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc149279_3823_4588_a559_a348efdb9bcd.slice/crio-ce6298b00e8e11de78c13bdaba1d02e38e6a1460e10b13278bea27af7b46faf1 WatchSource:0}: Error finding container ce6298b00e8e11de78c13bdaba1d02e38e6a1460e10b13278bea27af7b46faf1: Status 404 returned error can't find the container with id ce6298b00e8e11de78c13bdaba1d02e38e6a1460e10b13278bea27af7b46faf1 Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.910184 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" event={"ID":"cc149279-3823-4588-a559-a348efdb9bcd","Type":"ContainerStarted","Data":"ce6298b00e8e11de78c13bdaba1d02e38e6a1460e10b13278bea27af7b46faf1"} Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.913601 4744 generic.go:334] "Generic (PLEG): container finished" podID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerID="01e1cfa52bcbdeedd1f0612c7d5ff98e95cafb6b0689bc34899a7628beb623b8" exitCode=0 Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.913705 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerDied","Data":"01e1cfa52bcbdeedd1f0612c7d5ff98e95cafb6b0689bc34899a7628beb623b8"} Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.913740 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lzch" event={"ID":"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1","Type":"ContainerDied","Data":"fdd04fc05ec331038a9d89d99c7a130f7fab897066c9510d4606426a1521002e"} Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.913755 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd04fc05ec331038a9d89d99c7a130f7fab897066c9510d4606426a1521002e" Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.914589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" event={"ID":"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e","Type":"ContainerStarted","Data":"284d9e76c746dbb66cc618beca788008381ef47689473b7e95aa3b6610579337"} Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.935560 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.975543 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-kube-api-access-hs78b\") pod \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.975588 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-catalog-content\") pod \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " Sep 30 03:07:58 crc kubenswrapper[4744]: I0930 03:07:58.981851 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-kube-api-access-hs78b" (OuterVolumeSpecName: "kube-api-access-hs78b") pod "e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" (UID: "e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1"). InnerVolumeSpecName "kube-api-access-hs78b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.062931 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" (UID: "e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.088802 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-utilities\") pod \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\" (UID: \"e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1\") " Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.089827 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-utilities" (OuterVolumeSpecName: "utilities") pod "e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" (UID: "e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.090071 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.090089 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-kube-api-access-hs78b\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.090102 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.920021 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lzch" Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.938898 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4lzch"] Sep 30 03:07:59 crc kubenswrapper[4744]: I0930 03:07:59.942031 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4lzch"] Sep 30 03:08:00 crc kubenswrapper[4744]: I0930 03:08:00.824098 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:08:00 crc kubenswrapper[4744]: I0930 03:08:00.824138 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:08:00 crc kubenswrapper[4744]: I0930 03:08:00.875216 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:08:00 crc kubenswrapper[4744]: I0930 03:08:00.969991 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:08:01 crc kubenswrapper[4744]: I0930 03:08:01.535357 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" path="/var/lib/kubelet/pods/e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1/volumes" Sep 30 03:08:02 crc kubenswrapper[4744]: I0930 03:08:02.945625 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" event={"ID":"3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e","Type":"ContainerStarted","Data":"0ff8f813fd20aa241b0ad69a8df8cd257cd8ffaa79d0cbda594f5fcacda3217f"} Sep 30 03:08:02 crc kubenswrapper[4744]: I0930 03:08:02.946029 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:08:02 crc 
kubenswrapper[4744]: I0930 03:08:02.948846 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" event={"ID":"cc149279-3823-4588-a559-a348efdb9bcd","Type":"ContainerStarted","Data":"8788a1b5b50e37566a190d185c30d14857c6415138df1cf17352c60bacae5d9b"} Sep 30 03:08:02 crc kubenswrapper[4744]: I0930 03:08:02.949080 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:08:03 crc kubenswrapper[4744]: I0930 03:08:03.002171 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" podStartSLOduration=1.297196322 podStartE2EDuration="6.002140231s" podCreationTimestamp="2025-09-30 03:07:57 +0000 UTC" firstStartedPulling="2025-09-30 03:07:57.949966022 +0000 UTC m=+805.123186006" lastFinishedPulling="2025-09-30 03:08:02.654909941 +0000 UTC m=+809.828129915" observedRunningTime="2025-09-30 03:08:02.975001498 +0000 UTC m=+810.148221472" watchObservedRunningTime="2025-09-30 03:08:03.002140231 +0000 UTC m=+810.175360215" Sep 30 03:08:03 crc kubenswrapper[4744]: I0930 03:08:03.007170 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" podStartSLOduration=1.411383493 podStartE2EDuration="6.007153217s" podCreationTimestamp="2025-09-30 03:07:57 +0000 UTC" firstStartedPulling="2025-09-30 03:07:58.072308315 +0000 UTC m=+805.245528299" lastFinishedPulling="2025-09-30 03:08:02.668078049 +0000 UTC m=+809.841298023" observedRunningTime="2025-09-30 03:08:03.003654478 +0000 UTC m=+810.176874452" watchObservedRunningTime="2025-09-30 03:08:03.007153217 +0000 UTC m=+810.180373201" Sep 30 03:08:05 crc kubenswrapper[4744]: I0930 03:08:05.881572 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pgrk"] Sep 30 
03:08:05 crc kubenswrapper[4744]: I0930 03:08:05.882402 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8pgrk" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="registry-server" containerID="cri-o://aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364" gracePeriod=2 Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.372714 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.424664 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-catalog-content\") pod \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.424993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw757\" (UniqueName: \"kubernetes.io/projected/c1fe0a74-7657-4f93-b46c-ed598fb9c295-kube-api-access-pw757\") pod \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.425063 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-utilities\") pod \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\" (UID: \"c1fe0a74-7657-4f93-b46c-ed598fb9c295\") " Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.425963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-utilities" (OuterVolumeSpecName: "utilities") pod "c1fe0a74-7657-4f93-b46c-ed598fb9c295" (UID: "c1fe0a74-7657-4f93-b46c-ed598fb9c295"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.433683 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1fe0a74-7657-4f93-b46c-ed598fb9c295-kube-api-access-pw757" (OuterVolumeSpecName: "kube-api-access-pw757") pod "c1fe0a74-7657-4f93-b46c-ed598fb9c295" (UID: "c1fe0a74-7657-4f93-b46c-ed598fb9c295"). InnerVolumeSpecName "kube-api-access-pw757". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.482909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1fe0a74-7657-4f93-b46c-ed598fb9c295" (UID: "c1fe0a74-7657-4f93-b46c-ed598fb9c295"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.527179 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.527206 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw757\" (UniqueName: \"kubernetes.io/projected/c1fe0a74-7657-4f93-b46c-ed598fb9c295-kube-api-access-pw757\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.527220 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1fe0a74-7657-4f93-b46c-ed598fb9c295-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.896114 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jmdvv"] Sep 30 03:08:06 crc 
kubenswrapper[4744]: E0930 03:08:06.897011 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="extract-utilities" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897037 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="extract-utilities" Sep 30 03:08:06 crc kubenswrapper[4744]: E0930 03:08:06.897054 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="registry-server" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897068 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="registry-server" Sep 30 03:08:06 crc kubenswrapper[4744]: E0930 03:08:06.897089 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="extract-content" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897102 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="extract-content" Sep 30 03:08:06 crc kubenswrapper[4744]: E0930 03:08:06.897119 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="extract-content" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897130 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="extract-content" Sep 30 03:08:06 crc kubenswrapper[4744]: E0930 03:08:06.897150 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="extract-utilities" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897162 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="extract-utilities" Sep 30 03:08:06 crc 
kubenswrapper[4744]: E0930 03:08:06.897193 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="registry-server" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897204 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="registry-server" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897404 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e3af16-b06e-4c80-bae1-9cd7d18cdbb1" containerName="registry-server" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.897426 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerName="registry-server" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.898883 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.915702 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmdvv"] Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.975583 4744 generic.go:334] "Generic (PLEG): container finished" podID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" containerID="aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364" exitCode=0 Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.975626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgrk" event={"ID":"c1fe0a74-7657-4f93-b46c-ed598fb9c295","Type":"ContainerDied","Data":"aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364"} Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.975650 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgrk" 
event={"ID":"c1fe0a74-7657-4f93-b46c-ed598fb9c295","Type":"ContainerDied","Data":"7ce8c04d18dbf7df2f749800d3a2b6b75e37e03aa13255bcb790eb381534be7f"} Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.975670 4744 scope.go:117] "RemoveContainer" containerID="aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364" Sep 30 03:08:06 crc kubenswrapper[4744]: I0930 03:08:06.975668 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgrk" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.016891 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pgrk"] Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.025207 4744 scope.go:117] "RemoveContainer" containerID="ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.029084 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8pgrk"] Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.034929 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-catalog-content\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.035013 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-utilities\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.035039 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42bh\" (UniqueName: \"kubernetes.io/projected/86196e92-481a-43fa-9e1f-7ea8ec7866fd-kube-api-access-f42bh\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.073278 4744 scope.go:117] "RemoveContainer" containerID="9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.097085 4744 scope.go:117] "RemoveContainer" containerID="aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364" Sep 30 03:08:07 crc kubenswrapper[4744]: E0930 03:08:07.097430 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364\": container with ID starting with aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364 not found: ID does not exist" containerID="aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.097529 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364"} err="failed to get container status \"aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364\": rpc error: code = NotFound desc = could not find container \"aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364\": container with ID starting with aea5d1a5490669324af84a64bb825927632bcf2ac0715ca6732d2ae18617e364 not found: ID does not exist" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.097604 4744 scope.go:117] "RemoveContainer" containerID="ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2" Sep 30 03:08:07 crc kubenswrapper[4744]: E0930 03:08:07.097857 4744 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2\": container with ID starting with ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2 not found: ID does not exist" containerID="ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.097935 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2"} err="failed to get container status \"ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2\": rpc error: code = NotFound desc = could not find container \"ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2\": container with ID starting with ec0dcd4aef04831fa21209458f88ebbebdb46833c1b101bf629f9f1855b99ab2 not found: ID does not exist" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.098012 4744 scope.go:117] "RemoveContainer" containerID="9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8" Sep 30 03:08:07 crc kubenswrapper[4744]: E0930 03:08:07.098300 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8\": container with ID starting with 9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8 not found: ID does not exist" containerID="9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.098338 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8"} err="failed to get container status \"9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8\": rpc error: code = NotFound 
desc = could not find container \"9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8\": container with ID starting with 9e53e684888577a0536e88a6d82612d2fe18ccf5e5d136fb185400c64ba787c8 not found: ID does not exist" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.136711 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-catalog-content\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.137028 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-utilities\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.137120 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42bh\" (UniqueName: \"kubernetes.io/projected/86196e92-481a-43fa-9e1f-7ea8ec7866fd-kube-api-access-f42bh\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.137408 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-catalog-content\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.137520 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-utilities\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.154351 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42bh\" (UniqueName: \"kubernetes.io/projected/86196e92-481a-43fa-9e1f-7ea8ec7866fd-kube-api-access-f42bh\") pod \"community-operators-jmdvv\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.223549 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.510274 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1fe0a74-7657-4f93-b46c-ed598fb9c295" path="/var/lib/kubelet/pods/c1fe0a74-7657-4f93-b46c-ed598fb9c295/volumes" Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.680404 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jmdvv"] Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.986117 4744 generic.go:334] "Generic (PLEG): container finished" podID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerID="2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78" exitCode=0 Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.986180 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmdvv" event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerDied","Data":"2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78"} Sep 30 03:08:07 crc kubenswrapper[4744]: I0930 03:08:07.986215 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmdvv" 
event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerStarted","Data":"136c9966334e444deebbb87b98b239442315f8e02d2167f19ae8b45165295efd"} Sep 30 03:08:08 crc kubenswrapper[4744]: I0930 03:08:08.992351 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmdvv" event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerStarted","Data":"46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee"} Sep 30 03:08:10 crc kubenswrapper[4744]: I0930 03:08:10.003866 4744 generic.go:334] "Generic (PLEG): container finished" podID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerID="46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee" exitCode=0 Sep 30 03:08:10 crc kubenswrapper[4744]: I0930 03:08:10.003911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmdvv" event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerDied","Data":"46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee"} Sep 30 03:08:11 crc kubenswrapper[4744]: I0930 03:08:11.011733 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmdvv" event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerStarted","Data":"50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d"} Sep 30 03:08:11 crc kubenswrapper[4744]: I0930 03:08:11.032425 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jmdvv" podStartSLOduration=2.424264737 podStartE2EDuration="5.032361691s" podCreationTimestamp="2025-09-30 03:08:06 +0000 UTC" firstStartedPulling="2025-09-30 03:08:07.988182195 +0000 UTC m=+815.161402169" lastFinishedPulling="2025-09-30 03:08:10.596279139 +0000 UTC m=+817.769499123" observedRunningTime="2025-09-30 03:08:11.02974953 +0000 UTC m=+818.202969514" watchObservedRunningTime="2025-09-30 03:08:11.032361691 +0000 UTC 
m=+818.205581675" Sep 30 03:08:17 crc kubenswrapper[4744]: I0930 03:08:17.223721 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:17 crc kubenswrapper[4744]: I0930 03:08:17.224049 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:17 crc kubenswrapper[4744]: I0930 03:08:17.259544 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:17 crc kubenswrapper[4744]: I0930 03:08:17.661290 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-67f6c5dc78-c7h6t" Sep 30 03:08:18 crc kubenswrapper[4744]: I0930 03:08:18.134611 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:19 crc kubenswrapper[4744]: I0930 03:08:19.677126 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmdvv"] Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.086285 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jmdvv" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="registry-server" containerID="cri-o://50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d" gracePeriod=2 Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.555060 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.616054 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-catalog-content\") pod \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.616207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f42bh\" (UniqueName: \"kubernetes.io/projected/86196e92-481a-43fa-9e1f-7ea8ec7866fd-kube-api-access-f42bh\") pod \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.616291 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-utilities\") pod \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\" (UID: \"86196e92-481a-43fa-9e1f-7ea8ec7866fd\") " Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.620125 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-utilities" (OuterVolumeSpecName: "utilities") pod "86196e92-481a-43fa-9e1f-7ea8ec7866fd" (UID: "86196e92-481a-43fa-9e1f-7ea8ec7866fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.626324 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86196e92-481a-43fa-9e1f-7ea8ec7866fd-kube-api-access-f42bh" (OuterVolumeSpecName: "kube-api-access-f42bh") pod "86196e92-481a-43fa-9e1f-7ea8ec7866fd" (UID: "86196e92-481a-43fa-9e1f-7ea8ec7866fd"). InnerVolumeSpecName "kube-api-access-f42bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.694637 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86196e92-481a-43fa-9e1f-7ea8ec7866fd" (UID: "86196e92-481a-43fa-9e1f-7ea8ec7866fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.717689 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.717724 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f42bh\" (UniqueName: \"kubernetes.io/projected/86196e92-481a-43fa-9e1f-7ea8ec7866fd-kube-api-access-f42bh\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:20 crc kubenswrapper[4744]: I0930 03:08:20.717738 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86196e92-481a-43fa-9e1f-7ea8ec7866fd-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.096760 4744 generic.go:334] "Generic (PLEG): container finished" podID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerID="50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d" exitCode=0 Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.096855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jmdvv" event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerDied","Data":"50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d"} Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.097084 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-jmdvv" event={"ID":"86196e92-481a-43fa-9e1f-7ea8ec7866fd","Type":"ContainerDied","Data":"136c9966334e444deebbb87b98b239442315f8e02d2167f19ae8b45165295efd"} Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.097110 4744 scope.go:117] "RemoveContainer" containerID="50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.096884 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jmdvv" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.130875 4744 scope.go:117] "RemoveContainer" containerID="46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.141356 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jmdvv"] Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.147586 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jmdvv"] Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.170593 4744 scope.go:117] "RemoveContainer" containerID="2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.200094 4744 scope.go:117] "RemoveContainer" containerID="50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d" Sep 30 03:08:21 crc kubenswrapper[4744]: E0930 03:08:21.200552 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d\": container with ID starting with 50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d not found: ID does not exist" containerID="50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 
03:08:21.200590 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d"} err="failed to get container status \"50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d\": rpc error: code = NotFound desc = could not find container \"50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d\": container with ID starting with 50f21e3d3b820ddd0353b172ab21839467b581782be2bebbcb9467042b28f78d not found: ID does not exist" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.200616 4744 scope.go:117] "RemoveContainer" containerID="46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee" Sep 30 03:08:21 crc kubenswrapper[4744]: E0930 03:08:21.200977 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee\": container with ID starting with 46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee not found: ID does not exist" containerID="46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.201042 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee"} err="failed to get container status \"46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee\": rpc error: code = NotFound desc = could not find container \"46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee\": container with ID starting with 46f8e02a8e471bd0e42962bb0b8261ae61b61f654c352f8ea7208204220bb3ee not found: ID does not exist" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.201083 4744 scope.go:117] "RemoveContainer" containerID="2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78" Sep 30 03:08:21 crc 
kubenswrapper[4744]: E0930 03:08:21.201580 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78\": container with ID starting with 2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78 not found: ID does not exist" containerID="2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.201614 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78"} err="failed to get container status \"2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78\": rpc error: code = NotFound desc = could not find container \"2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78\": container with ID starting with 2c338345a07c8869a420507fee43f90fb23e04da7556d7c8f03cbe186d267c78 not found: ID does not exist" Sep 30 03:08:21 crc kubenswrapper[4744]: I0930 03:08:21.516704 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" path="/var/lib/kubelet/pods/86196e92-481a-43fa-9e1f-7ea8ec7866fd/volumes" Sep 30 03:08:37 crc kubenswrapper[4744]: I0930 03:08:37.444359 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58d4cc4478-pt5sc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.306381 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t"] Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.306613 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="extract-content" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.306625 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="extract-content" Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.306637 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="registry-server" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.306643 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="registry-server" Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.306653 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="extract-utilities" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.306660 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="extract-utilities" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.306766 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="86196e92-481a-43fa-9e1f-7ea8ec7866fd" containerName="registry-server" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.307147 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.309655 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.312753 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jlw2j" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.315213 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-p7zk7"] Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.317225 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.318979 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.320014 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.333333 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t"] Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.383939 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-reloader\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.383982 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c5228111-8563-4e96-abae-b748a4677ff8-frr-startup\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384017 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-frr-conf\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384043 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5nzl\" (UniqueName: \"kubernetes.io/projected/c5228111-8563-4e96-abae-b748a4677ff8-kube-api-access-h5nzl\") pod 
\"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384084 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-frr-sockets\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384108 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228111-8563-4e96-abae-b748a4677ff8-metrics-certs\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384253 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-metrics\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384296 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplhv\" (UniqueName: \"kubernetes.io/projected/416e2fa8-29ae-42c2-a71a-863244e1b5df-kube-api-access-hplhv\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: \"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.384317 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/416e2fa8-29ae-42c2-a71a-863244e1b5df-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: 
\"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.408502 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dthlc"] Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.409548 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.414432 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.414461 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lnwjs" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.415256 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.416570 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.424454 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-jg7r7"] Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.425550 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.429169 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.442359 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-jg7r7"] Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485621 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-metrics\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485680 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplhv\" (UniqueName: \"kubernetes.io/projected/416e2fa8-29ae-42c2-a71a-863244e1b5df-kube-api-access-hplhv\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: \"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485707 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/416e2fa8-29ae-42c2-a71a-863244e1b5df-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: \"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485733 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n75dk\" (UniqueName: \"kubernetes.io/projected/15204744-c1cf-4027-8131-fd89b0544638-kube-api-access-n75dk\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " 
pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-metrics-certs\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485793 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-reloader\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485808 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-cert\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485823 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c5228111-8563-4e96-abae-b748a4677ff8-frr-startup\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485840 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485859 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-frr-conf\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485881 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ae230cf-d8e3-49d5-a336-fd028e0f5303-metallb-excludel2\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485898 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5nzl\" (UniqueName: \"kubernetes.io/projected/c5228111-8563-4e96-abae-b748a4677ff8-kube-api-access-h5nzl\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d72l\" (UniqueName: \"kubernetes.io/projected/5ae230cf-d8e3-49d5-a336-fd028e0f5303-kube-api-access-7d72l\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485941 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-frr-sockets\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485971 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c5228111-8563-4e96-abae-b748a4677ff8-metrics-certs\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.485999 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-metrics-certs\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.486073 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-metrics\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.486180 4744 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.486225 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/416e2fa8-29ae-42c2-a71a-863244e1b5df-cert podName:416e2fa8-29ae-42c2-a71a-863244e1b5df nodeName:}" failed. No retries permitted until 2025-09-30 03:08:38.98620826 +0000 UTC m=+846.159428234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/416e2fa8-29ae-42c2-a71a-863244e1b5df-cert") pod "frr-k8s-webhook-server-5478bdb765-bwg9t" (UID: "416e2fa8-29ae-42c2-a71a-863244e1b5df") : secret "frr-k8s-webhook-server-cert" not found Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.486687 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-frr-conf\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.486698 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-reloader\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.486913 4744 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.486996 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5228111-8563-4e96-abae-b748a4677ff8-metrics-certs podName:c5228111-8563-4e96-abae-b748a4677ff8 nodeName:}" failed. No retries permitted until 2025-09-30 03:08:38.986973074 +0000 UTC m=+846.160193048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5228111-8563-4e96-abae-b748a4677ff8-metrics-certs") pod "frr-k8s-p7zk7" (UID: "c5228111-8563-4e96-abae-b748a4677ff8") : secret "frr-k8s-certs-secret" not found Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.487210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c5228111-8563-4e96-abae-b748a4677ff8-frr-sockets\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.487483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c5228111-8563-4e96-abae-b748a4677ff8-frr-startup\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.506920 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5nzl\" (UniqueName: \"kubernetes.io/projected/c5228111-8563-4e96-abae-b748a4677ff8-kube-api-access-h5nzl\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.507158 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplhv\" (UniqueName: \"kubernetes.io/projected/416e2fa8-29ae-42c2-a71a-863244e1b5df-kube-api-access-hplhv\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: \"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587437 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/5ae230cf-d8e3-49d5-a336-fd028e0f5303-metallb-excludel2\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587493 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d72l\" (UniqueName: \"kubernetes.io/projected/5ae230cf-d8e3-49d5-a336-fd028e0f5303-kube-api-access-7d72l\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587576 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-metrics-certs\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587656 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n75dk\" (UniqueName: \"kubernetes.io/projected/15204744-c1cf-4027-8131-fd89b0544638-kube-api-access-n75dk\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-metrics-certs\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587739 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-cert\") pod \"controller-5d688f5ffc-jg7r7\" (UID: 
\"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.587760 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.587858 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.587906 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist podName:5ae230cf-d8e3-49d5-a336-fd028e0f5303 nodeName:}" failed. No retries permitted until 2025-09-30 03:08:39.087891907 +0000 UTC m=+846.261111881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist") pod "speaker-dthlc" (UID: "5ae230cf-d8e3-49d5-a336-fd028e0f5303") : secret "metallb-memberlist" not found Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.588119 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ae230cf-d8e3-49d5-a336-fd028e0f5303-metallb-excludel2\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.588769 4744 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Sep 30 03:08:38 crc kubenswrapper[4744]: E0930 03:08:38.588827 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-metrics-certs 
podName:15204744-c1cf-4027-8131-fd89b0544638 nodeName:}" failed. No retries permitted until 2025-09-30 03:08:39.088812155 +0000 UTC m=+846.262032129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-metrics-certs") pod "controller-5d688f5ffc-jg7r7" (UID: "15204744-c1cf-4027-8131-fd89b0544638") : secret "controller-certs-secret" not found Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.590215 4744 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.591540 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-metrics-certs\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.605117 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-cert\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.608026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75dk\" (UniqueName: \"kubernetes.io/projected/15204744-c1cf-4027-8131-fd89b0544638-kube-api-access-n75dk\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.608175 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d72l\" (UniqueName: \"kubernetes.io/projected/5ae230cf-d8e3-49d5-a336-fd028e0f5303-kube-api-access-7d72l\") pod 
\"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.992989 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228111-8563-4e96-abae-b748a4677ff8-metrics-certs\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.993081 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/416e2fa8-29ae-42c2-a71a-863244e1b5df-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: \"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:38 crc kubenswrapper[4744]: I0930 03:08:38.999756 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/416e2fa8-29ae-42c2-a71a-863244e1b5df-cert\") pod \"frr-k8s-webhook-server-5478bdb765-bwg9t\" (UID: \"416e2fa8-29ae-42c2-a71a-863244e1b5df\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.000106 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228111-8563-4e96-abae-b748a4677ff8-metrics-certs\") pod \"frr-k8s-p7zk7\" (UID: \"c5228111-8563-4e96-abae-b748a4677ff8\") " pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.094157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-metrics-certs\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:39 crc 
kubenswrapper[4744]: I0930 03:08:39.094225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:39 crc kubenswrapper[4744]: E0930 03:08:39.094464 4744 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 03:08:39 crc kubenswrapper[4744]: E0930 03:08:39.094550 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist podName:5ae230cf-d8e3-49d5-a336-fd028e0f5303 nodeName:}" failed. No retries permitted until 2025-09-30 03:08:40.094527596 +0000 UTC m=+847.267747590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist") pod "speaker-dthlc" (UID: "5ae230cf-d8e3-49d5-a336-fd028e0f5303") : secret "metallb-memberlist" not found Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.099003 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15204744-c1cf-4027-8131-fd89b0544638-metrics-certs\") pod \"controller-5d688f5ffc-jg7r7\" (UID: \"15204744-c1cf-4027-8131-fd89b0544638\") " pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.230199 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.238716 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.339871 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.754624 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t"] Sep 30 03:08:39 crc kubenswrapper[4744]: I0930 03:08:39.816700 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-jg7r7"] Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.108632 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.119343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ae230cf-d8e3-49d5-a336-fd028e0f5303-memberlist\") pod \"speaker-dthlc\" (UID: \"5ae230cf-d8e3-49d5-a336-fd028e0f5303\") " pod="metallb-system/speaker-dthlc" Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.224242 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dthlc" Sep 30 03:08:40 crc kubenswrapper[4744]: W0930 03:08:40.250393 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae230cf_d8e3_49d5_a336_fd028e0f5303.slice/crio-568c3096504461d197189413334bd969de75d7ec44fb9578d3a8f6331c0b9c68 WatchSource:0}: Error finding container 568c3096504461d197189413334bd969de75d7ec44fb9578d3a8f6331c0b9c68: Status 404 returned error can't find the container with id 568c3096504461d197189413334bd969de75d7ec44fb9578d3a8f6331c0b9c68 Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.260590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-jg7r7" event={"ID":"15204744-c1cf-4027-8131-fd89b0544638","Type":"ContainerStarted","Data":"63db8476d3ebf8f1aa2bd4a06b5a4b6aac58e813a05dd7e08189c24802bee7d1"} Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.260652 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-jg7r7" event={"ID":"15204744-c1cf-4027-8131-fd89b0544638","Type":"ContainerStarted","Data":"c7011f7d4aeabae61177092f3ecbaa6c11fe66ee1486a8a80c4ab3b72d483d56"} Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.260675 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-jg7r7" event={"ID":"15204744-c1cf-4027-8131-fd89b0544638","Type":"ContainerStarted","Data":"fffab1f945375bef1e7bb79388f965d89453f1c59c10c11ab5806489e75bccb0"} Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.260744 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.262696 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" 
event={"ID":"416e2fa8-29ae-42c2-a71a-863244e1b5df","Type":"ContainerStarted","Data":"a9bfb316d304378bac7697257bd1ebc2e3c70991257d80d29493e9e9d4dbf07d"} Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.264524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"24f7724853740c0c5fc8774fc549617a3051abaadaefe1e9ed6219ba473afebb"} Sep 30 03:08:40 crc kubenswrapper[4744]: I0930 03:08:40.288832 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-jg7r7" podStartSLOduration=2.288813213 podStartE2EDuration="2.288813213s" podCreationTimestamp="2025-09-30 03:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:08:40.284622073 +0000 UTC m=+847.457842087" watchObservedRunningTime="2025-09-30 03:08:40.288813213 +0000 UTC m=+847.462033197" Sep 30 03:08:41 crc kubenswrapper[4744]: I0930 03:08:41.273005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dthlc" event={"ID":"5ae230cf-d8e3-49d5-a336-fd028e0f5303","Type":"ContainerStarted","Data":"b39a93bd42e553d169c29bbf345e11dbcb89c30909e98d5173b6f4b54fac4a2d"} Sep 30 03:08:41 crc kubenswrapper[4744]: I0930 03:08:41.273324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dthlc" event={"ID":"5ae230cf-d8e3-49d5-a336-fd028e0f5303","Type":"ContainerStarted","Data":"17351caadfd11873b520761e6829e03683f18896d44e0d66ecf53a1813d3bfcf"} Sep 30 03:08:41 crc kubenswrapper[4744]: I0930 03:08:41.273339 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dthlc" event={"ID":"5ae230cf-d8e3-49d5-a336-fd028e0f5303","Type":"ContainerStarted","Data":"568c3096504461d197189413334bd969de75d7ec44fb9578d3a8f6331c0b9c68"} Sep 30 03:08:41 crc kubenswrapper[4744]: I0930 
03:08:41.273537 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dthlc" Sep 30 03:08:41 crc kubenswrapper[4744]: I0930 03:08:41.289717 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dthlc" podStartSLOduration=3.289702897 podStartE2EDuration="3.289702897s" podCreationTimestamp="2025-09-30 03:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:08:41.286393845 +0000 UTC m=+848.459613819" watchObservedRunningTime="2025-09-30 03:08:41.289702897 +0000 UTC m=+848.462922871" Sep 30 03:08:47 crc kubenswrapper[4744]: I0930 03:08:47.315957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" event={"ID":"416e2fa8-29ae-42c2-a71a-863244e1b5df","Type":"ContainerStarted","Data":"554fd100befd8a906340250c587adf99e8079a585a7d43b6c99af721088fd602"} Sep 30 03:08:47 crc kubenswrapper[4744]: I0930 03:08:47.317013 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:47 crc kubenswrapper[4744]: I0930 03:08:47.318821 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5228111-8563-4e96-abae-b748a4677ff8" containerID="cbe8c76971c19f164abd64545375b19b60c51bc30068c29a36667a41f0057977" exitCode=0 Sep 30 03:08:47 crc kubenswrapper[4744]: I0930 03:08:47.318872 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerDied","Data":"cbe8c76971c19f164abd64545375b19b60c51bc30068c29a36667a41f0057977"} Sep 30 03:08:47 crc kubenswrapper[4744]: I0930 03:08:47.343311 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" podStartSLOduration=2.352226971 
podStartE2EDuration="9.343295837s" podCreationTimestamp="2025-09-30 03:08:38 +0000 UTC" firstStartedPulling="2025-09-30 03:08:39.746760434 +0000 UTC m=+846.919980408" lastFinishedPulling="2025-09-30 03:08:46.73782929 +0000 UTC m=+853.911049274" observedRunningTime="2025-09-30 03:08:47.339426516 +0000 UTC m=+854.512646490" watchObservedRunningTime="2025-09-30 03:08:47.343295837 +0000 UTC m=+854.516515811" Sep 30 03:08:48 crc kubenswrapper[4744]: I0930 03:08:48.330010 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5228111-8563-4e96-abae-b748a4677ff8" containerID="d29a4604d4a9f525ece174b56e7df4bf05289bc9d05fe444001fb7f05b7e2957" exitCode=0 Sep 30 03:08:48 crc kubenswrapper[4744]: I0930 03:08:48.330189 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerDied","Data":"d29a4604d4a9f525ece174b56e7df4bf05289bc9d05fe444001fb7f05b7e2957"} Sep 30 03:08:49 crc kubenswrapper[4744]: I0930 03:08:49.338728 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5228111-8563-4e96-abae-b748a4677ff8" containerID="2728ca2f0922e6dd47713123244647ddc9d549b27e12a1593e2b18f2ada5dc56" exitCode=0 Sep 30 03:08:49 crc kubenswrapper[4744]: I0930 03:08:49.338832 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerDied","Data":"2728ca2f0922e6dd47713123244647ddc9d549b27e12a1593e2b18f2ada5dc56"} Sep 30 03:08:49 crc kubenswrapper[4744]: I0930 03:08:49.346217 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-jg7r7" Sep 30 03:08:50 crc kubenswrapper[4744]: I0930 03:08:50.228785 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dthlc" Sep 30 03:08:50 crc kubenswrapper[4744]: I0930 03:08:50.351237 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"3a9f0e76829c815384fd7e870eacef1a9ac82d2d9aca4aa2bcb00ec65abfaf33"} Sep 30 03:08:50 crc kubenswrapper[4744]: I0930 03:08:50.351287 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"62ef11d36f8fd195cff561dabdf46f6d8a6246f0252a7262c6ce0ac93d73969f"} Sep 30 03:08:50 crc kubenswrapper[4744]: I0930 03:08:50.351301 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"4fb0437e5f3f4273a040d4a5a22efe7d39d072e5e14d99eba4c6ea0ae8fcfb54"} Sep 30 03:08:50 crc kubenswrapper[4744]: I0930 03:08:50.351314 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"ea3652f7f7e096a791a5ae7c619eec8952eb28a5bfe78481db7b20b8ccb9b275"} Sep 30 03:08:50 crc kubenswrapper[4744]: I0930 03:08:50.351327 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"c50e4333a10216f8ad6d0aad7c3528a9d7777823d534f828027fa8dbe8a546d1"} Sep 30 03:08:51 crc kubenswrapper[4744]: I0930 03:08:51.370932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p7zk7" event={"ID":"c5228111-8563-4e96-abae-b748a4677ff8","Type":"ContainerStarted","Data":"f971c8bd7d0043f7029a01f04309d6e8fe421c289aca30ba21952d0cda66fedc"} Sep 30 03:08:51 crc kubenswrapper[4744]: I0930 03:08:51.372431 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:51 crc kubenswrapper[4744]: I0930 03:08:51.419406 4744 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="metallb-system/frr-k8s-p7zk7" podStartSLOduration=6.11109309 podStartE2EDuration="13.419356112s" podCreationTimestamp="2025-09-30 03:08:38 +0000 UTC" firstStartedPulling="2025-09-30 03:08:39.431604511 +0000 UTC m=+846.604824485" lastFinishedPulling="2025-09-30 03:08:46.739867523 +0000 UTC m=+853.913087507" observedRunningTime="2025-09-30 03:08:51.408504905 +0000 UTC m=+858.581724939" watchObservedRunningTime="2025-09-30 03:08:51.419356112 +0000 UTC m=+858.592576126" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.218172 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rn5kz"] Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.219610 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.221504 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-7fnfg" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.221603 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.223016 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.236198 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rn5kz"] Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.236841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxpf\" (UniqueName: \"kubernetes.io/projected/e9721318-a2bc-479c-83d3-09b8db844f71-kube-api-access-lwxpf\") pod \"openstack-operator-index-rn5kz\" (UID: \"e9721318-a2bc-479c-83d3-09b8db844f71\") " 
pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.338292 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxpf\" (UniqueName: \"kubernetes.io/projected/e9721318-a2bc-479c-83d3-09b8db844f71-kube-api-access-lwxpf\") pod \"openstack-operator-index-rn5kz\" (UID: \"e9721318-a2bc-479c-83d3-09b8db844f71\") " pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.355407 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxpf\" (UniqueName: \"kubernetes.io/projected/e9721318-a2bc-479c-83d3-09b8db844f71-kube-api-access-lwxpf\") pod \"openstack-operator-index-rn5kz\" (UID: \"e9721318-a2bc-479c-83d3-09b8db844f71\") " pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:53 crc kubenswrapper[4744]: I0930 03:08:53.538331 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:54 crc kubenswrapper[4744]: I0930 03:08:54.006074 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rn5kz"] Sep 30 03:08:54 crc kubenswrapper[4744]: I0930 03:08:54.238980 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:54 crc kubenswrapper[4744]: I0930 03:08:54.310079 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:54 crc kubenswrapper[4744]: I0930 03:08:54.394832 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rn5kz" event={"ID":"e9721318-a2bc-479c-83d3-09b8db844f71","Type":"ContainerStarted","Data":"3dbc84ce1907fb0cb6238462335b3368ac73e22ddefca569ef46f7a18413ac77"} Sep 30 03:08:56 crc kubenswrapper[4744]: I0930 03:08:56.605729 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rn5kz"] Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.211297 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2drwl"] Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.212802 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.236430 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2drwl"] Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.328684 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mtrw\" (UniqueName: \"kubernetes.io/projected/f429d57e-28b9-4f82-bb1f-494d295492d1-kube-api-access-6mtrw\") pod \"openstack-operator-index-2drwl\" (UID: \"f429d57e-28b9-4f82-bb1f-494d295492d1\") " pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.416746 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rn5kz" event={"ID":"e9721318-a2bc-479c-83d3-09b8db844f71","Type":"ContainerStarted","Data":"d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713"} Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.416926 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rn5kz" podUID="e9721318-a2bc-479c-83d3-09b8db844f71" containerName="registry-server" containerID="cri-o://d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713" gracePeriod=2 Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.430433 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mtrw\" (UniqueName: 
\"kubernetes.io/projected/f429d57e-28b9-4f82-bb1f-494d295492d1-kube-api-access-6mtrw\") pod \"openstack-operator-index-2drwl\" (UID: \"f429d57e-28b9-4f82-bb1f-494d295492d1\") " pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.451309 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rn5kz" podStartSLOduration=1.94757606 podStartE2EDuration="4.4512746s" podCreationTimestamp="2025-09-30 03:08:53 +0000 UTC" firstStartedPulling="2025-09-30 03:08:54.028589439 +0000 UTC m=+861.201809423" lastFinishedPulling="2025-09-30 03:08:56.532287989 +0000 UTC m=+863.705507963" observedRunningTime="2025-09-30 03:08:57.437834202 +0000 UTC m=+864.611054216" watchObservedRunningTime="2025-09-30 03:08:57.4512746 +0000 UTC m=+864.624494624" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.470327 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mtrw\" (UniqueName: \"kubernetes.io/projected/f429d57e-28b9-4f82-bb1f-494d295492d1-kube-api-access-6mtrw\") pod \"openstack-operator-index-2drwl\" (UID: \"f429d57e-28b9-4f82-bb1f-494d295492d1\") " pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.550172 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.865031 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2drwl"] Sep 30 03:08:57 crc kubenswrapper[4744]: W0930 03:08:57.871712 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf429d57e_28b9_4f82_bb1f_494d295492d1.slice/crio-8be8fe540377cd919dbd6c44c9f19c0840846136e0afd3a4fb5d56831f02e3ff WatchSource:0}: Error finding container 8be8fe540377cd919dbd6c44c9f19c0840846136e0afd3a4fb5d56831f02e3ff: Status 404 returned error can't find the container with id 8be8fe540377cd919dbd6c44c9f19c0840846136e0afd3a4fb5d56831f02e3ff Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.900909 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.935895 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwxpf\" (UniqueName: \"kubernetes.io/projected/e9721318-a2bc-479c-83d3-09b8db844f71-kube-api-access-lwxpf\") pod \"e9721318-a2bc-479c-83d3-09b8db844f71\" (UID: \"e9721318-a2bc-479c-83d3-09b8db844f71\") " Sep 30 03:08:57 crc kubenswrapper[4744]: I0930 03:08:57.943107 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9721318-a2bc-479c-83d3-09b8db844f71-kube-api-access-lwxpf" (OuterVolumeSpecName: "kube-api-access-lwxpf") pod "e9721318-a2bc-479c-83d3-09b8db844f71" (UID: "e9721318-a2bc-479c-83d3-09b8db844f71"). InnerVolumeSpecName "kube-api-access-lwxpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.039008 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwxpf\" (UniqueName: \"kubernetes.io/projected/e9721318-a2bc-479c-83d3-09b8db844f71-kube-api-access-lwxpf\") on node \"crc\" DevicePath \"\"" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.433282 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2drwl" event={"ID":"f429d57e-28b9-4f82-bb1f-494d295492d1","Type":"ContainerStarted","Data":"f5a0d57351c51a98b53975c63d8390ae3e95b4e269c0b4fea16e6f12c5907bcb"} Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.433351 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2drwl" event={"ID":"f429d57e-28b9-4f82-bb1f-494d295492d1","Type":"ContainerStarted","Data":"8be8fe540377cd919dbd6c44c9f19c0840846136e0afd3a4fb5d56831f02e3ff"} Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.436824 4744 generic.go:334] "Generic (PLEG): container finished" podID="e9721318-a2bc-479c-83d3-09b8db844f71" containerID="d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713" exitCode=0 Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.437043 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rn5kz" event={"ID":"e9721318-a2bc-479c-83d3-09b8db844f71","Type":"ContainerDied","Data":"d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713"} Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.437098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rn5kz" event={"ID":"e9721318-a2bc-479c-83d3-09b8db844f71","Type":"ContainerDied","Data":"3dbc84ce1907fb0cb6238462335b3368ac73e22ddefca569ef46f7a18413ac77"} Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.437130 4744 scope.go:117] "RemoveContainer" 
containerID="d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.437297 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rn5kz" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.455690 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2drwl" podStartSLOduration=1.3985691679999999 podStartE2EDuration="1.455660951s" podCreationTimestamp="2025-09-30 03:08:57 +0000 UTC" firstStartedPulling="2025-09-30 03:08:57.87615921 +0000 UTC m=+865.049379184" lastFinishedPulling="2025-09-30 03:08:57.933251003 +0000 UTC m=+865.106470967" observedRunningTime="2025-09-30 03:08:58.454303739 +0000 UTC m=+865.627523763" watchObservedRunningTime="2025-09-30 03:08:58.455660951 +0000 UTC m=+865.628880965" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.471629 4744 scope.go:117] "RemoveContainer" containerID="d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713" Sep 30 03:08:58 crc kubenswrapper[4744]: E0930 03:08:58.472200 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713\": container with ID starting with d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713 not found: ID does not exist" containerID="d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.472280 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713"} err="failed to get container status \"d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713\": rpc error: code = NotFound desc = could not find container 
\"d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713\": container with ID starting with d23b68c16b95686a1d4f848f8f3f6a0cd3e22ee901c1575f585a597ebd856713 not found: ID does not exist" Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.498566 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rn5kz"] Sep 30 03:08:58 crc kubenswrapper[4744]: I0930 03:08:58.505942 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rn5kz"] Sep 30 03:08:59 crc kubenswrapper[4744]: I0930 03:08:59.237872 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-bwg9t" Sep 30 03:08:59 crc kubenswrapper[4744]: I0930 03:08:59.243588 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-p7zk7" Sep 30 03:08:59 crc kubenswrapper[4744]: I0930 03:08:59.513180 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9721318-a2bc-479c-83d3-09b8db844f71" path="/var/lib/kubelet/pods/e9721318-a2bc-479c-83d3-09b8db844f71/volumes" Sep 30 03:09:07 crc kubenswrapper[4744]: I0930 03:09:07.550942 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:09:07 crc kubenswrapper[4744]: I0930 03:09:07.552545 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:09:07 crc kubenswrapper[4744]: I0930 03:09:07.594504 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:09:08 crc kubenswrapper[4744]: I0930 03:09:08.552665 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-2drwl" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 
03:09:16.377933 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq"] Sep 30 03:09:16 crc kubenswrapper[4744]: E0930 03:09:16.378659 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9721318-a2bc-479c-83d3-09b8db844f71" containerName="registry-server" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.378673 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9721318-a2bc-479c-83d3-09b8db844f71" containerName="registry-server" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.378773 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9721318-a2bc-479c-83d3-09b8db844f71" containerName="registry-server" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.379522 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.381852 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qcn27" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.388154 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq"] Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.447170 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-bundle\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.447530 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-util\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.447616 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2x5w\" (UniqueName: \"kubernetes.io/projected/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-kube-api-access-t2x5w\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.549107 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2x5w\" (UniqueName: \"kubernetes.io/projected/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-kube-api-access-t2x5w\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.549176 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-bundle\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.549195 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-util\") pod 
\"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.549782 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-util\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.550111 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-bundle\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.576606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2x5w\" (UniqueName: \"kubernetes.io/projected/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-kube-api-access-t2x5w\") pod \"fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:16 crc kubenswrapper[4744]: I0930 03:09:16.697974 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:17 crc kubenswrapper[4744]: I0930 03:09:17.184436 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq"] Sep 30 03:09:17 crc kubenswrapper[4744]: I0930 03:09:17.577706 4744 generic.go:334] "Generic (PLEG): container finished" podID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerID="ed92803899b6e9bc2961ee7c8a23ea8b9ace359546c648f048af8cf21fc069d8" exitCode=0 Sep 30 03:09:17 crc kubenswrapper[4744]: I0930 03:09:17.577899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" event={"ID":"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef","Type":"ContainerDied","Data":"ed92803899b6e9bc2961ee7c8a23ea8b9ace359546c648f048af8cf21fc069d8"} Sep 30 03:09:17 crc kubenswrapper[4744]: I0930 03:09:17.578051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" event={"ID":"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef","Type":"ContainerStarted","Data":"a4781a61d8fad845c2c1f7d02f52a10dffcaa96990174a5f799c22bbb5d5df02"} Sep 30 03:09:18 crc kubenswrapper[4744]: I0930 03:09:18.604743 4744 generic.go:334] "Generic (PLEG): container finished" podID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerID="213340f6b8e98f35b831b75bb97acd49a346a96ceee035d9d0b7f333fb3931f0" exitCode=0 Sep 30 03:09:18 crc kubenswrapper[4744]: I0930 03:09:18.604812 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" event={"ID":"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef","Type":"ContainerDied","Data":"213340f6b8e98f35b831b75bb97acd49a346a96ceee035d9d0b7f333fb3931f0"} Sep 30 03:09:19 crc kubenswrapper[4744]: I0930 03:09:19.615660 4744 generic.go:334] 
"Generic (PLEG): container finished" podID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerID="6b2c62a03ce58566f35949667dcf116d48aa65961fe6bea88821755093e5c990" exitCode=0 Sep 30 03:09:19 crc kubenswrapper[4744]: I0930 03:09:19.615788 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" event={"ID":"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef","Type":"ContainerDied","Data":"6b2c62a03ce58566f35949667dcf116d48aa65961fe6bea88821755093e5c990"} Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.018146 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.135194 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-bundle\") pod \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.135625 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-util\") pod \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.135678 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2x5w\" (UniqueName: \"kubernetes.io/projected/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-kube-api-access-t2x5w\") pod \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\" (UID: \"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef\") " Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.157431 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-bundle" (OuterVolumeSpecName: "bundle") pod "9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" (UID: "9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.165753 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-kube-api-access-t2x5w" (OuterVolumeSpecName: "kube-api-access-t2x5w") pod "9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" (UID: "9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef"). InnerVolumeSpecName "kube-api-access-t2x5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.179643 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-util" (OuterVolumeSpecName: "util") pod "9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" (UID: "9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.237834 4744 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.237876 4744 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-util\") on node \"crc\" DevicePath \"\"" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.237890 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2x5w\" (UniqueName: \"kubernetes.io/projected/9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef-kube-api-access-t2x5w\") on node \"crc\" DevicePath \"\"" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.634293 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" event={"ID":"9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef","Type":"ContainerDied","Data":"a4781a61d8fad845c2c1f7d02f52a10dffcaa96990174a5f799c22bbb5d5df02"} Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.634341 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4781a61d8fad845c2c1f7d02f52a10dffcaa96990174a5f799c22bbb5d5df02" Sep 30 03:09:21 crc kubenswrapper[4744]: I0930 03:09:21.634407 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.267750 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5"] Sep 30 03:09:29 crc kubenswrapper[4744]: E0930 03:09:29.268479 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="extract" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.268502 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="extract" Sep 30 03:09:29 crc kubenswrapper[4744]: E0930 03:09:29.268519 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="util" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.268531 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="util" Sep 30 03:09:29 crc kubenswrapper[4744]: E0930 03:09:29.268547 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="pull" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.268558 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="pull" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.268721 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef" containerName="extract" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.269737 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.273398 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-s9kj9" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.313432 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5"] Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.356403 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrc8\" (UniqueName: \"kubernetes.io/projected/fec734bd-0bd4-4e73-9d2d-cd6f0f002577-kube-api-access-ctrc8\") pod \"openstack-operator-controller-operator-5d55cf86f4-4xvw5\" (UID: \"fec734bd-0bd4-4e73-9d2d-cd6f0f002577\") " pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.458446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrc8\" (UniqueName: \"kubernetes.io/projected/fec734bd-0bd4-4e73-9d2d-cd6f0f002577-kube-api-access-ctrc8\") pod \"openstack-operator-controller-operator-5d55cf86f4-4xvw5\" (UID: \"fec734bd-0bd4-4e73-9d2d-cd6f0f002577\") " pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.488441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrc8\" (UniqueName: \"kubernetes.io/projected/fec734bd-0bd4-4e73-9d2d-cd6f0f002577-kube-api-access-ctrc8\") pod \"openstack-operator-controller-operator-5d55cf86f4-4xvw5\" (UID: \"fec734bd-0bd4-4e73-9d2d-cd6f0f002577\") " pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.599720 4744 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.916433 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5"] Sep 30 03:09:29 crc kubenswrapper[4744]: I0930 03:09:29.941126 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:09:30 crc kubenswrapper[4744]: I0930 03:09:30.710954 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" event={"ID":"fec734bd-0bd4-4e73-9d2d-cd6f0f002577","Type":"ContainerStarted","Data":"1cd5d8f8300c5648d17e1b7dd64659e8583adb0a17bf9b7ca751756f5dc25e5c"} Sep 30 03:09:33 crc kubenswrapper[4744]: I0930 03:09:33.734607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" event={"ID":"fec734bd-0bd4-4e73-9d2d-cd6f0f002577","Type":"ContainerStarted","Data":"26c5d36a1af534f3142550d89600715df4bd4ec305a723c140b762b1c2320108"} Sep 30 03:09:34 crc kubenswrapper[4744]: I0930 03:09:34.347787 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:09:34 crc kubenswrapper[4744]: I0930 03:09:34.348119 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:09:36 crc 
kubenswrapper[4744]: I0930 03:09:36.760876 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" event={"ID":"fec734bd-0bd4-4e73-9d2d-cd6f0f002577","Type":"ContainerStarted","Data":"2b34ac43e4a9ea6733cfbcf5fbaf5e207725135eb754ee9a53647e7e4cbb1c72"} Sep 30 03:09:36 crc kubenswrapper[4744]: I0930 03:09:36.762519 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:09:36 crc kubenswrapper[4744]: I0930 03:09:36.800835 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" podStartSLOduration=1.843218545 podStartE2EDuration="7.800806674s" podCreationTimestamp="2025-09-30 03:09:29 +0000 UTC" firstStartedPulling="2025-09-30 03:09:29.940427007 +0000 UTC m=+897.113646991" lastFinishedPulling="2025-09-30 03:09:35.898015136 +0000 UTC m=+903.071235120" observedRunningTime="2025-09-30 03:09:36.798792442 +0000 UTC m=+903.972012456" watchObservedRunningTime="2025-09-30 03:09:36.800806674 +0000 UTC m=+903.974026688" Sep 30 03:09:39 crc kubenswrapper[4744]: I0930 03:09:39.604671 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5d55cf86f4-4xvw5" Sep 30 03:10:04 crc kubenswrapper[4744]: I0930 03:10:04.348176 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:10:04 crc kubenswrapper[4744]: I0930 03:10:04.348769 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.375171 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.377476 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.378772 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.379726 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.384697 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-68w9c" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.386646 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.387521 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zw4q5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.396213 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.419463 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-88j8h\" (UniqueName: \"kubernetes.io/projected/464a78d3-19ea-4024-95f8-65c384a11de5-kube-api-access-88j8h\") pod \"barbican-operator-controller-manager-6ff8b75857-nh8lm\" (UID: \"464a78d3-19ea-4024-95f8-65c384a11de5\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.419518 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcg7f\" (UniqueName: \"kubernetes.io/projected/48c24f9c-7ad2-4b16-8586-a98cc6f5745d-kube-api-access-lcg7f\") pod \"cinder-operator-controller-manager-644bddb6d8-lfgcl\" (UID: \"48c24f9c-7ad2-4b16-8586-a98cc6f5745d\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.427546 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.428787 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.442065 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-8fm66" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.468240 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.469308 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.472681 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6mdz5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.472882 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.500846 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.502830 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.509184 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-g22qt" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.521465 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zjg9\" (UniqueName: \"kubernetes.io/projected/7c60f5e8-9ac8-4729-9030-a17a74c66872-kube-api-access-7zjg9\") pod \"glance-operator-controller-manager-84958c4d49-f5wr4\" (UID: \"7c60f5e8-9ac8-4729-9030-a17a74c66872\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.521512 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxzd\" (UniqueName: \"kubernetes.io/projected/2754694b-4135-4439-ae89-dd08166467a5-kube-api-access-lqxzd\") pod \"designate-operator-controller-manager-84f4f7b77b-r4pn4\" (UID: \"2754694b-4135-4439-ae89-dd08166467a5\") " 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.521551 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88j8h\" (UniqueName: \"kubernetes.io/projected/464a78d3-19ea-4024-95f8-65c384a11de5-kube-api-access-88j8h\") pod \"barbican-operator-controller-manager-6ff8b75857-nh8lm\" (UID: \"464a78d3-19ea-4024-95f8-65c384a11de5\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.521584 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcg7f\" (UniqueName: \"kubernetes.io/projected/48c24f9c-7ad2-4b16-8586-a98cc6f5745d-kube-api-access-lcg7f\") pod \"cinder-operator-controller-manager-644bddb6d8-lfgcl\" (UID: \"48c24f9c-7ad2-4b16-8586-a98cc6f5745d\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.523413 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.525168 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.529976 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.530478 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.533954 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.540300 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.541367 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.541821 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.546751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcg7f\" (UniqueName: \"kubernetes.io/projected/48c24f9c-7ad2-4b16-8586-a98cc6f5745d-kube-api-access-lcg7f\") pod \"cinder-operator-controller-manager-644bddb6d8-lfgcl\" (UID: \"48c24f9c-7ad2-4b16-8586-a98cc6f5745d\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.547196 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8mktq" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.547203 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 
03:10:13.547969 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88j8h\" (UniqueName: \"kubernetes.io/projected/464a78d3-19ea-4024-95f8-65c384a11de5-kube-api-access-88j8h\") pod \"barbican-operator-controller-manager-6ff8b75857-nh8lm\" (UID: \"464a78d3-19ea-4024-95f8-65c384a11de5\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.549306 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-6m7pn" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.564493 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.565525 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.573993 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-j4r68" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.577557 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.585059 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.586418 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.587721 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-27t69" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.593517 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.608279 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.609558 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.612800 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bvjqf" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.622340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w667f\" (UniqueName: \"kubernetes.io/projected/1416686e-3057-4219-93e8-b6ed99e1b000-kube-api-access-w667f\") pod \"ironic-operator-controller-manager-7975b88857-plwv5\" (UID: \"1416686e-3057-4219-93e8-b6ed99e1b000\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.622677 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxzd\" (UniqueName: \"kubernetes.io/projected/2754694b-4135-4439-ae89-dd08166467a5-kube-api-access-lqxzd\") pod \"designate-operator-controller-manager-84f4f7b77b-r4pn4\" (UID: \"2754694b-4135-4439-ae89-dd08166467a5\") " 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.622807 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8hb\" (UniqueName: \"kubernetes.io/projected/c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad-kube-api-access-pf8hb\") pod \"heat-operator-controller-manager-5d889d78cf-jrs8k\" (UID: \"c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.622882 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4pb\" (UniqueName: \"kubernetes.io/projected/a6acae25-c5f3-4719-9a0d-866cef31aae8-kube-api-access-4m4pb\") pod \"horizon-operator-controller-manager-9f4696d94-jrmql\" (UID: \"a6acae25-c5f3-4719-9a0d-866cef31aae8\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.622966 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mssrj\" (UniqueName: \"kubernetes.io/projected/befb38ef-208d-435f-820a-787301b3c4b8-kube-api-access-mssrj\") pod \"keystone-operator-controller-manager-5bd55b4bff-n4bm8\" (UID: \"befb38ef-208d-435f-820a-787301b3c4b8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.623040 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnh2\" (UniqueName: \"kubernetes.io/projected/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-kube-api-access-qjnh2\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 
03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.623109 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-cert\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.623202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zjg9\" (UniqueName: \"kubernetes.io/projected/7c60f5e8-9ac8-4729-9030-a17a74c66872-kube-api-access-7zjg9\") pod \"glance-operator-controller-manager-84958c4d49-f5wr4\" (UID: \"7c60f5e8-9ac8-4729-9030-a17a74c66872\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.624913 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-5whrj"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.626110 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.632042 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mqg8j" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.640983 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.650714 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-5whrj"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.651772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxzd\" (UniqueName: \"kubernetes.io/projected/2754694b-4135-4439-ae89-dd08166467a5-kube-api-access-lqxzd\") pod \"designate-operator-controller-manager-84f4f7b77b-r4pn4\" (UID: \"2754694b-4135-4439-ae89-dd08166467a5\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.668267 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.669343 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.671736 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lv4rg" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.674037 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zjg9\" (UniqueName: \"kubernetes.io/projected/7c60f5e8-9ac8-4729-9030-a17a74c66872-kube-api-access-7zjg9\") pod \"glance-operator-controller-manager-84958c4d49-f5wr4\" (UID: \"7c60f5e8-9ac8-4729-9030-a17a74c66872\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.678619 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.679603 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.683567 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-v6585" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.690521 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.697707 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.719712 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.723156 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.724186 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725114 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8hb\" (UniqueName: \"kubernetes.io/projected/c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad-kube-api-access-pf8hb\") pod \"heat-operator-controller-manager-5d889d78cf-jrs8k\" (UID: \"c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725179 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4pb\" (UniqueName: \"kubernetes.io/projected/a6acae25-c5f3-4719-9a0d-866cef31aae8-kube-api-access-4m4pb\") pod \"horizon-operator-controller-manager-9f4696d94-jrmql\" (UID: \"a6acae25-c5f3-4719-9a0d-866cef31aae8\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725261 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mssrj\" (UniqueName: \"kubernetes.io/projected/befb38ef-208d-435f-820a-787301b3c4b8-kube-api-access-mssrj\") pod \"keystone-operator-controller-manager-5bd55b4bff-n4bm8\" (UID: \"befb38ef-208d-435f-820a-787301b3c4b8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725312 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qjnh2\" (UniqueName: \"kubernetes.io/projected/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-kube-api-access-qjnh2\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725351 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-cert\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725445 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mk5\" (UniqueName: \"kubernetes.io/projected/edff3052-2bfd-47d9-be42-5d8f608fc529-kube-api-access-f4mk5\") pod \"manila-operator-controller-manager-6d68dbc695-xc7mx\" (UID: \"edff3052-2bfd-47d9-be42-5d8f608fc529\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725544 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcf49\" (UniqueName: \"kubernetes.io/projected/adaa00a7-7a31-40ae-975e-47306e8128e8-kube-api-access-wcf49\") pod \"neutron-operator-controller-manager-64d7b59854-hl4qt\" (UID: \"adaa00a7-7a31-40ae-975e-47306e8128e8\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w667f\" (UniqueName: 
\"kubernetes.io/projected/1416686e-3057-4219-93e8-b6ed99e1b000-kube-api-access-w667f\") pod \"ironic-operator-controller-manager-7975b88857-plwv5\" (UID: \"1416686e-3057-4219-93e8-b6ed99e1b000\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725676 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrdzc\" (UniqueName: \"kubernetes.io/projected/be907fa2-e5ce-461e-bad7-7ff67b7b28fc-kube-api-access-vrdzc\") pod \"nova-operator-controller-manager-c7c776c96-hjmhz\" (UID: \"be907fa2-e5ce-461e-bad7-7ff67b7b28fc\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsm9w\" (UniqueName: \"kubernetes.io/projected/7955a9b4-f81b-45cd-bc57-b96bef24b064-kube-api-access-gsm9w\") pod \"mariadb-operator-controller-manager-88c7-5whrj\" (UID: \"7955a9b4-f81b-45cd-bc57-b96bef24b064\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.725954 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qwpzp" Sep 30 03:10:13 crc kubenswrapper[4744]: E0930 03:10:13.726199 4744 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 03:10:13 crc kubenswrapper[4744]: E0930 03:10:13.726246 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-cert podName:a21e2f23-2adc-4f24-be18-72c39bb6ac8e nodeName:}" failed. 
No retries permitted until 2025-09-30 03:10:14.226229577 +0000 UTC m=+941.399449551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-cert") pod "infra-operator-controller-manager-7d857cc749-z8f6l" (UID: "a21e2f23-2adc-4f24-be18-72c39bb6ac8e") : secret "infra-operator-webhook-server-cert" not found Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.733653 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.750218 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8hb\" (UniqueName: \"kubernetes.io/projected/c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad-kube-api-access-pf8hb\") pod \"heat-operator-controller-manager-5d889d78cf-jrs8k\" (UID: \"c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.751298 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnh2\" (UniqueName: \"kubernetes.io/projected/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-kube-api-access-qjnh2\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.751999 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4pb\" (UniqueName: \"kubernetes.io/projected/a6acae25-c5f3-4719-9a0d-866cef31aae8-kube-api-access-4m4pb\") pod \"horizon-operator-controller-manager-9f4696d94-jrmql\" (UID: \"a6acae25-c5f3-4719-9a0d-866cef31aae8\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:13 crc 
kubenswrapper[4744]: I0930 03:10:13.752019 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w667f\" (UniqueName: \"kubernetes.io/projected/1416686e-3057-4219-93e8-b6ed99e1b000-kube-api-access-w667f\") pod \"ironic-operator-controller-manager-7975b88857-plwv5\" (UID: \"1416686e-3057-4219-93e8-b6ed99e1b000\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.759157 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mssrj\" (UniqueName: \"kubernetes.io/projected/befb38ef-208d-435f-820a-787301b3c4b8-kube-api-access-mssrj\") pod \"keystone-operator-controller-manager-5bd55b4bff-n4bm8\" (UID: \"befb38ef-208d-435f-820a-787301b3c4b8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.761429 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.761826 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.781357 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.782599 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.790240 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cjp4n" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.818172 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.833739 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mk5\" (UniqueName: \"kubernetes.io/projected/edff3052-2bfd-47d9-be42-5d8f608fc529-kube-api-access-f4mk5\") pod \"manila-operator-controller-manager-6d68dbc695-xc7mx\" (UID: \"edff3052-2bfd-47d9-be42-5d8f608fc529\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.833790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcf49\" (UniqueName: \"kubernetes.io/projected/adaa00a7-7a31-40ae-975e-47306e8128e8-kube-api-access-wcf49\") pod \"neutron-operator-controller-manager-64d7b59854-hl4qt\" (UID: \"adaa00a7-7a31-40ae-975e-47306e8128e8\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.833911 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrdzc\" (UniqueName: \"kubernetes.io/projected/be907fa2-e5ce-461e-bad7-7ff67b7b28fc-kube-api-access-vrdzc\") pod \"nova-operator-controller-manager-c7c776c96-hjmhz\" (UID: \"be907fa2-e5ce-461e-bad7-7ff67b7b28fc\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.833999 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gsm9w\" (UniqueName: \"kubernetes.io/projected/7955a9b4-f81b-45cd-bc57-b96bef24b064-kube-api-access-gsm9w\") pod \"mariadb-operator-controller-manager-88c7-5whrj\" (UID: \"7955a9b4-f81b-45cd-bc57-b96bef24b064\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.834031 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.834031 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbqtx\" (UniqueName: \"kubernetes.io/projected/bcc255bc-d09d-4f16-b541-4e206fb39a80-kube-api-access-wbqtx\") pod \"octavia-operator-controller-manager-76fcc6dc7c-zvxch\" (UID: \"bcc255bc-d09d-4f16-b541-4e206fb39a80\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.834647 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhkv8\" (UniqueName: \"kubernetes.io/projected/015eb732-be5e-404f-81e2-b43d012c356b-kube-api-access-bhkv8\") pod \"placement-operator-controller-manager-589c58c6c-pcn79\" (UID: \"015eb732-be5e-404f-81e2-b43d012c356b\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.843074 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.844305 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.848792 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hpg78" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.850138 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.851430 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.852893 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.856137 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pwtns" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.856712 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.858802 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsm9w\" (UniqueName: \"kubernetes.io/projected/7955a9b4-f81b-45cd-bc57-b96bef24b064-kube-api-access-gsm9w\") pod \"mariadb-operator-controller-manager-88c7-5whrj\" (UID: \"7955a9b4-f81b-45cd-bc57-b96bef24b064\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.858872 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrdzc\" (UniqueName: 
\"kubernetes.io/projected/be907fa2-e5ce-461e-bad7-7ff67b7b28fc-kube-api-access-vrdzc\") pod \"nova-operator-controller-manager-c7c776c96-hjmhz\" (UID: \"be907fa2-e5ce-461e-bad7-7ff67b7b28fc\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.861926 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.862054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mk5\" (UniqueName: \"kubernetes.io/projected/edff3052-2bfd-47d9-be42-5d8f608fc529-kube-api-access-f4mk5\") pod \"manila-operator-controller-manager-6d68dbc695-xc7mx\" (UID: \"edff3052-2bfd-47d9-be42-5d8f608fc529\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.871516 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcf49\" (UniqueName: \"kubernetes.io/projected/adaa00a7-7a31-40ae-975e-47306e8128e8-kube-api-access-wcf49\") pod \"neutron-operator-controller-manager-64d7b59854-hl4qt\" (UID: \"adaa00a7-7a31-40ae-975e-47306e8128e8\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.871589 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.881367 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.887558 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.889677 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.893801 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gbnkv" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.898490 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.905707 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.915848 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.917449 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.922903 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4gb6t" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.923629 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.927433 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.929835 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.940047 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmgf\" (UniqueName: \"kubernetes.io/projected/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-kube-api-access-znmgf\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.940093 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdcw\" (UniqueName: \"kubernetes.io/projected/c46f8a8d-f07f-4983-9971-6b06d47c8e38-kube-api-access-zmdcw\") pod \"swift-operator-controller-manager-bc7dc7bd9-g6w7n\" (UID: \"c46f8a8d-f07f-4983-9971-6b06d47c8e38\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.940113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:13 crc kubenswrapper[4744]: 
I0930 03:10:13.940141 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtcr\" (UniqueName: \"kubernetes.io/projected/5c70f54b-6405-4dcc-a2d2-e989b2516f0e-kube-api-access-frtcr\") pod \"ovn-operator-controller-manager-9976ff44c-kd8v7\" (UID: \"5c70f54b-6405-4dcc-a2d2-e989b2516f0e\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.940169 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbqtx\" (UniqueName: \"kubernetes.io/projected/bcc255bc-d09d-4f16-b541-4e206fb39a80-kube-api-access-wbqtx\") pod \"octavia-operator-controller-manager-76fcc6dc7c-zvxch\" (UID: \"bcc255bc-d09d-4f16-b541-4e206fb39a80\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.940203 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhkv8\" (UniqueName: \"kubernetes.io/projected/015eb732-be5e-404f-81e2-b43d012c356b-kube-api-access-bhkv8\") pod \"placement-operator-controller-manager-589c58c6c-pcn79\" (UID: \"015eb732-be5e-404f-81e2-b43d012c356b\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.949780 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.950902 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.951896 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.962439 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8s9wm" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.983708 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9"] Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.989551 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhkv8\" (UniqueName: \"kubernetes.io/projected/015eb732-be5e-404f-81e2-b43d012c356b-kube-api-access-bhkv8\") pod \"placement-operator-controller-manager-589c58c6c-pcn79\" (UID: \"015eb732-be5e-404f-81e2-b43d012c356b\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:13 crc kubenswrapper[4744]: I0930 03:10:13.997297 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.005984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbqtx\" (UniqueName: \"kubernetes.io/projected/bcc255bc-d09d-4f16-b541-4e206fb39a80-kube-api-access-wbqtx\") pod \"octavia-operator-controller-manager-76fcc6dc7c-zvxch\" (UID: \"bcc255bc-d09d-4f16-b541-4e206fb39a80\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.008305 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.008469 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.009622 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.013237 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-p5sdw" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.018708 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.042557 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdq7\" (UniqueName: \"kubernetes.io/projected/5ae4d03d-68a5-498a-992f-df43dbeebc73-kube-api-access-trdq7\") pod \"telemetry-operator-controller-manager-b8d54b5d7-5fx9s\" (UID: \"5ae4d03d-68a5-498a-992f-df43dbeebc73\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.042664 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmgf\" (UniqueName: \"kubernetes.io/projected/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-kube-api-access-znmgf\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.042703 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zmdcw\" (UniqueName: \"kubernetes.io/projected/c46f8a8d-f07f-4983-9971-6b06d47c8e38-kube-api-access-zmdcw\") pod \"swift-operator-controller-manager-bc7dc7bd9-g6w7n\" (UID: \"c46f8a8d-f07f-4983-9971-6b06d47c8e38\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.042722 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.042744 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cszb\" (UniqueName: \"kubernetes.io/projected/5502e6e0-3d2f-479c-a53f-005bbb749631-kube-api-access-8cszb\") pod \"test-operator-controller-manager-f66b554c6-p5zk9\" (UID: \"5502e6e0-3d2f-479c-a53f-005bbb749631\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.042783 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtcr\" (UniqueName: \"kubernetes.io/projected/5c70f54b-6405-4dcc-a2d2-e989b2516f0e-kube-api-access-frtcr\") pod \"ovn-operator-controller-manager-9976ff44c-kd8v7\" (UID: \"5c70f54b-6405-4dcc-a2d2-e989b2516f0e\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:14 crc kubenswrapper[4744]: E0930 03:10:14.043352 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 03:10:14 crc 
kubenswrapper[4744]: E0930 03:10:14.043413 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert podName:2fdc94bc-95cf-4a16-a6cc-0d277f4969bc nodeName:}" failed. No retries permitted until 2025-09-30 03:10:14.543398834 +0000 UTC m=+941.716618808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-gmkqb" (UID: "2fdc94bc-95cf-4a16-a6cc-0d277f4969bc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.065984 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdcw\" (UniqueName: \"kubernetes.io/projected/c46f8a8d-f07f-4983-9971-6b06d47c8e38-kube-api-access-zmdcw\") pod \"swift-operator-controller-manager-bc7dc7bd9-g6w7n\" (UID: \"c46f8a8d-f07f-4983-9971-6b06d47c8e38\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.074927 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtcr\" (UniqueName: \"kubernetes.io/projected/5c70f54b-6405-4dcc-a2d2-e989b2516f0e-kube-api-access-frtcr\") pod \"ovn-operator-controller-manager-9976ff44c-kd8v7\" (UID: \"5c70f54b-6405-4dcc-a2d2-e989b2516f0e\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.097955 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmgf\" (UniqueName: \"kubernetes.io/projected/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-kube-api-access-znmgf\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.114014 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.114188 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.135311 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.144770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4zdn\" (UniqueName: \"kubernetes.io/projected/6293cdef-44a9-4639-a40d-df02e9aa8410-kube-api-access-s4zdn\") pod \"watcher-operator-controller-manager-76669f99c-pvzfj\" (UID: \"6293cdef-44a9-4639-a40d-df02e9aa8410\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.144834 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cszb\" (UniqueName: \"kubernetes.io/projected/5502e6e0-3d2f-479c-a53f-005bbb749631-kube-api-access-8cszb\") pod \"test-operator-controller-manager-f66b554c6-p5zk9\" (UID: \"5502e6e0-3d2f-479c-a53f-005bbb749631\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.144904 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trdq7\" (UniqueName: \"kubernetes.io/projected/5ae4d03d-68a5-498a-992f-df43dbeebc73-kube-api-access-trdq7\") pod 
\"telemetry-operator-controller-manager-b8d54b5d7-5fx9s\" (UID: \"5ae4d03d-68a5-498a-992f-df43dbeebc73\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.160312 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.163595 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.175013 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.175578 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cszb\" (UniqueName: \"kubernetes.io/projected/5502e6e0-3d2f-479c-a53f-005bbb749631-kube-api-access-8cszb\") pod \"test-operator-controller-manager-f66b554c6-p5zk9\" (UID: \"5502e6e0-3d2f-479c-a53f-005bbb749631\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.180764 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9qml7" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.185027 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdq7\" (UniqueName: \"kubernetes.io/projected/5ae4d03d-68a5-498a-992f-df43dbeebc73-kube-api-access-trdq7\") pod \"telemetry-operator-controller-manager-b8d54b5d7-5fx9s\" (UID: \"5ae4d03d-68a5-498a-992f-df43dbeebc73\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.195028 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.196739 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.228168 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.229051 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.234905 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.236518 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6zzcs" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.247429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-cert\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.247505 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4zdn\" (UniqueName: \"kubernetes.io/projected/6293cdef-44a9-4639-a40d-df02e9aa8410-kube-api-access-s4zdn\") pod \"watcher-operator-controller-manager-76669f99c-pvzfj\" (UID: \"6293cdef-44a9-4639-a40d-df02e9aa8410\") " 
pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.247562 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1dc02df0-0d0c-49bc-b3e8-69efc93c3167-cert\") pod \"openstack-operator-controller-manager-5ff84bd547-bw5gx\" (UID: \"1dc02df0-0d0c-49bc-b3e8-69efc93c3167\") " pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.247598 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxnk\" (UniqueName: \"kubernetes.io/projected/1dc02df0-0d0c-49bc-b3e8-69efc93c3167-kube-api-access-mnxnk\") pod \"openstack-operator-controller-manager-5ff84bd547-bw5gx\" (UID: \"1dc02df0-0d0c-49bc-b3e8-69efc93c3167\") " pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.255813 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a21e2f23-2adc-4f24-be18-72c39bb6ac8e-cert\") pod \"infra-operator-controller-manager-7d857cc749-z8f6l\" (UID: \"a21e2f23-2adc-4f24-be18-72c39bb6ac8e\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.282054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4zdn\" (UniqueName: \"kubernetes.io/projected/6293cdef-44a9-4639-a40d-df02e9aa8410-kube-api-access-s4zdn\") pod \"watcher-operator-controller-manager-76669f99c-pvzfj\" (UID: \"6293cdef-44a9-4639-a40d-df02e9aa8410\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.349030 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvkl\" (UniqueName: \"kubernetes.io/projected/bb137b23-9366-4d2c-bc9d-ec50ccaef237-kube-api-access-hrvkl\") pod \"rabbitmq-cluster-operator-manager-79d8469568-2vpmm\" (UID: \"bb137b23-9366-4d2c-bc9d-ec50ccaef237\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.349076 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1dc02df0-0d0c-49bc-b3e8-69efc93c3167-cert\") pod \"openstack-operator-controller-manager-5ff84bd547-bw5gx\" (UID: \"1dc02df0-0d0c-49bc-b3e8-69efc93c3167\") " pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.349109 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxnk\" (UniqueName: \"kubernetes.io/projected/1dc02df0-0d0c-49bc-b3e8-69efc93c3167-kube-api-access-mnxnk\") pod \"openstack-operator-controller-manager-5ff84bd547-bw5gx\" (UID: \"1dc02df0-0d0c-49bc-b3e8-69efc93c3167\") " pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.359006 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1dc02df0-0d0c-49bc-b3e8-69efc93c3167-cert\") pod \"openstack-operator-controller-manager-5ff84bd547-bw5gx\" (UID: \"1dc02df0-0d0c-49bc-b3e8-69efc93c3167\") " pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.363053 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxnk\" (UniqueName: \"kubernetes.io/projected/1dc02df0-0d0c-49bc-b3e8-69efc93c3167-kube-api-access-mnxnk\") pod 
\"openstack-operator-controller-manager-5ff84bd547-bw5gx\" (UID: \"1dc02df0-0d0c-49bc-b3e8-69efc93c3167\") " pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.428116 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.445082 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.450914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvkl\" (UniqueName: \"kubernetes.io/projected/bb137b23-9366-4d2c-bc9d-ec50ccaef237-kube-api-access-hrvkl\") pod \"rabbitmq-cluster-operator-manager-79d8469568-2vpmm\" (UID: \"bb137b23-9366-4d2c-bc9d-ec50ccaef237\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.473703 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.474394 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvkl\" (UniqueName: \"kubernetes.io/projected/bb137b23-9366-4d2c-bc9d-ec50ccaef237-kube-api-access-hrvkl\") pod \"rabbitmq-cluster-operator-manager-79d8469568-2vpmm\" (UID: \"bb137b23-9366-4d2c-bc9d-ec50ccaef237\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.494064 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.515726 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.554803 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:14 crc kubenswrapper[4744]: E0930 03:10:14.555511 4744 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 03:10:14 crc kubenswrapper[4744]: E0930 03:10:14.555557 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert podName:2fdc94bc-95cf-4a16-a6cc-0d277f4969bc nodeName:}" failed. No retries permitted until 2025-09-30 03:10:15.555544624 +0000 UTC m=+942.728764598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-gmkqb" (UID: "2fdc94bc-95cf-4a16-a6cc-0d277f4969bc") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.664198 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.687198 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.769574 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.777955 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4"] Sep 30 03:10:14 crc kubenswrapper[4744]: I0930 03:10:14.784529 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm"] Sep 30 03:10:14 crc kubenswrapper[4744]: W0930 03:10:14.789030 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c60f5e8_9ac8_4729_9030_a17a74c66872.slice/crio-fbd76a04d4b044b65b4f077b5530e0a97aa55eb90c57d83829530dc3bdcb258f WatchSource:0}: Error finding container fbd76a04d4b044b65b4f077b5530e0a97aa55eb90c57d83829530dc3bdcb258f: Status 404 returned error can't find the container with id fbd76a04d4b044b65b4f077b5530e0a97aa55eb90c57d83829530dc3bdcb258f Sep 30 03:10:14 crc kubenswrapper[4744]: W0930 03:10:14.792475 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod464a78d3_19ea_4024_95f8_65c384a11de5.slice/crio-39a4ed533809dd45adf7bd0667d0d9f2ad663d5bfeb8f2aedf25bb53429b8b01 WatchSource:0}: Error finding container 39a4ed533809dd45adf7bd0667d0d9f2ad663d5bfeb8f2aedf25bb53429b8b01: Status 404 returned error can't find the container with id 39a4ed533809dd45adf7bd0667d0d9f2ad663d5bfeb8f2aedf25bb53429b8b01 Sep 30 03:10:15 crc 
kubenswrapper[4744]: I0930 03:10:15.073198 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" event={"ID":"7c60f5e8-9ac8-4729-9030-a17a74c66872","Type":"ContainerStarted","Data":"fbd76a04d4b044b65b4f077b5530e0a97aa55eb90c57d83829530dc3bdcb258f"} Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.074861 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" event={"ID":"48c24f9c-7ad2-4b16-8586-a98cc6f5745d","Type":"ContainerStarted","Data":"9f1ea228f02314392c7a86f8e81d0224e5d7fb777af9d4d1a362eb6f58dfa267"} Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.075916 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" event={"ID":"464a78d3-19ea-4024-95f8-65c384a11de5","Type":"ContainerStarted","Data":"39a4ed533809dd45adf7bd0667d0d9f2ad663d5bfeb8f2aedf25bb53429b8b01"} Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.077250 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" event={"ID":"2754694b-4135-4439-ae89-dd08166467a5","Type":"ContainerStarted","Data":"5c97461c2ec5f5dc72f61202d6748a4194c50b651f76d70103bef0d2d6a9bcee"} Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.177750 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c70f54b_6405_4dcc_a2d2_e989b2516f0e.slice/crio-217f23c905fe33e5d3755d517f3ce096a8a75a6e25dfd5ff57aa698d95ca8c54 WatchSource:0}: Error finding container 217f23c905fe33e5d3755d517f3ce096a8a75a6e25dfd5ff57aa698d95ca8c54: Status 404 returned error can't find the container with id 217f23c905fe33e5d3755d517f3ce096a8a75a6e25dfd5ff57aa698d95ca8c54 Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.181670 4744 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.191114 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-5whrj"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.204936 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8"] Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.209955 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7955a9b4_f81b_45cd_bc57_b96bef24b064.slice/crio-5397d10ad9bde702ba1530c0fb071739f202b3f4606c2224d4d47e956eae774f WatchSource:0}: Error finding container 5397d10ad9bde702ba1530c0fb071739f202b3f4606c2224d4d47e956eae774f: Status 404 returned error can't find the container with id 5397d10ad9bde702ba1530c0fb071739f202b3f4606c2224d4d47e956eae774f Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.217631 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.221204 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.233472 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.244912 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k"] Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.249862 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6acae25_c5f3_4719_9a0d_866cef31aae8.slice/crio-ae301675e7a60aea91f4af7e6e98c2ccf69fbe6cedda1433b4dfc4e08f1a981d WatchSource:0}: Error finding container ae301675e7a60aea91f4af7e6e98c2ccf69fbe6cedda1433b4dfc4e08f1a981d: Status 404 returned error can't find the container with id ae301675e7a60aea91f4af7e6e98c2ccf69fbe6cedda1433b4dfc4e08f1a981d Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.251555 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql"] Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.252656 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a11cfd_aa9b_4aaf_9d4f_59b7308620ad.slice/crio-70709272f65a374bfbc8c3ea652063695025f00a7f415780a0b0017317adeb19 WatchSource:0}: Error finding container 70709272f65a374bfbc8c3ea652063695025f00a7f415780a0b0017317adeb19: Status 404 returned error can't find the container with id 70709272f65a374bfbc8c3ea652063695025f00a7f415780a0b0017317adeb19 Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.256816 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadaa00a7_7a31_40ae_975e_47306e8128e8.slice/crio-aab7bdf7b5ebddbb6f098ffeea9d5e8c03c0983b3a056cd87e9cd073d275fb6f WatchSource:0}: Error finding container aab7bdf7b5ebddbb6f098ffeea9d5e8c03c0983b3a056cd87e9cd073d275fb6f: Status 404 returned error can't find the container with id aab7bdf7b5ebddbb6f098ffeea9d5e8c03c0983b3a056cd87e9cd073d275fb6f Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.258960 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch"] Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.265536 4744 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc255bc_d09d_4f16_b541_4e206fb39a80.slice/crio-79bebe71578805ef368c8fbe7dcb01cea773582c9648ec8f74436488e3afa959 WatchSource:0}: Error finding container 79bebe71578805ef368c8fbe7dcb01cea773582c9648ec8f74436488e3afa959: Status 404 returned error can't find the container with id 79bebe71578805ef368c8fbe7dcb01cea773582c9648ec8f74436488e3afa959 Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.268503 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbqtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-76fcc6dc7c-zvxch_openstack-operators(bcc255bc-d09d-4f16-b541-4e206fb39a80): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.268957 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wcf49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64d7b59854-hl4qt_openstack-operators(adaa00a7-7a31-40ae-975e-47306e8128e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.277744 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.284090 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79"] Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.289269 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhkv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-pcn79_openstack-operators(015eb732-be5e-404f-81e2-b43d012c356b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.290346 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s"] Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.320549 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-trdq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-5fx9s_openstack-operators(5ae4d03d-68a5-498a-992f-df43dbeebc73): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.461774 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" 
podUID="adaa00a7-7a31-40ae-975e-47306e8128e8" Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.478780 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" podUID="bcc255bc-d09d-4f16-b541-4e206fb39a80" Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.484877 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm"] Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.492498 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" podUID="015eb732-be5e-404f-81e2-b43d012c356b" Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.494167 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n"] Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.495276 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" podUID="5ae4d03d-68a5-498a-992f-df43dbeebc73" Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.500282 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46f8a8d_f07f_4983_9971_6b06d47c8e38.slice/crio-8674a6e71dd21600f99e6203369f6f0c3894847e0ec431837d18ffd76bcb5ca0 WatchSource:0}: Error finding container 8674a6e71dd21600f99e6203369f6f0c3894847e0ec431837d18ffd76bcb5ca0: Status 404 returned error can't find the container with id 8674a6e71dd21600f99e6203369f6f0c3894847e0ec431837d18ffd76bcb5ca0 Sep 30 
03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.502812 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21e2f23_2adc_4f24_be18_72c39bb6ac8e.slice/crio-3456e0fad25c93e834573bc085614f375037e7860cac48c98da0ad62b58e9c81 WatchSource:0}: Error finding container 3456e0fad25c93e834573bc085614f375037e7860cac48c98da0ad62b58e9c81: Status 404 returned error can't find the container with id 3456e0fad25c93e834573bc085614f375037e7860cac48c98da0ad62b58e9c81 Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.504478 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5502e6e0_3d2f_479c_a53f_005bbb749631.slice/crio-8c156d5f21ed96467f25c56fbd5fc0d20d93ca0b14ff4acc35370836b5aea22e WatchSource:0}: Error finding container 8c156d5f21ed96467f25c56fbd5fc0d20d93ca0b14ff4acc35370836b5aea22e: Status 404 returned error can't find the container with id 8c156d5f21ed96467f25c56fbd5fc0d20d93ca0b14ff4acc35370836b5aea22e Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.509912 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8cszb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-f66b554c6-p5zk9_openstack-operators(5502e6e0-3d2f-479c-a53f-005bbb749631): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.514751 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb137b23_9366_4d2c_bc9d_ec50ccaef237.slice/crio-5e75c8fe3e9d2b906af1bc5c3c52ddfee6130745fc9f5cc2342819abb4539ac2 WatchSource:0}: Error finding container 
5e75c8fe3e9d2b906af1bc5c3c52ddfee6130745fc9f5cc2342819abb4539ac2: Status 404 returned error can't find the container with id 5e75c8fe3e9d2b906af1bc5c3c52ddfee6130745fc9f5cc2342819abb4539ac2 Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.536757 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrvkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-2vpmm_openstack-operators(bb137b23-9366-4d2c-bc9d-ec50ccaef237): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.536757 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjnh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d857cc749-z8f6l_openstack-operators(a21e2f23-2adc-4f24-be18-72c39bb6ac8e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 03:10:15 crc kubenswrapper[4744]: W0930 03:10:15.536837 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc02df0_0d0c_49bc_b3e8_69efc93c3167.slice/crio-e03ac6fe38ecb8d9cf36ce58b840c8377ed791192af2db607d44a784b81a0e95 WatchSource:0}: Error finding container e03ac6fe38ecb8d9cf36ce58b840c8377ed791192af2db607d44a784b81a0e95: Status 404 returned error can't find the container with id e03ac6fe38ecb8d9cf36ce58b840c8377ed791192af2db607d44a784b81a0e95 Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.537938 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" podUID="bb137b23-9366-4d2c-bc9d-ec50ccaef237" Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.540328 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.540354 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.540364 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.540388 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj"] Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.570210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:15 crc 
kubenswrapper[4744]: I0930 03:10:15.577362 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2fdc94bc-95cf-4a16-a6cc-0d277f4969bc-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-gmkqb\" (UID: \"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:15 crc kubenswrapper[4744]: I0930 03:10:15.753819 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.769389 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" podUID="5502e6e0-3d2f-479c-a53f-005bbb749631" Sep 30 03:10:15 crc kubenswrapper[4744]: E0930 03:10:15.770848 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" podUID="a21e2f23-2adc-4f24-be18-72c39bb6ac8e" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.121245 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" event={"ID":"adaa00a7-7a31-40ae-975e-47306e8128e8","Type":"ContainerStarted","Data":"16f3b0c712e23171e3b083ff13fee12aa0f8854d750cefeaa6dc02a8d39bbb43"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.121577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" 
event={"ID":"adaa00a7-7a31-40ae-975e-47306e8128e8","Type":"ContainerStarted","Data":"aab7bdf7b5ebddbb6f098ffeea9d5e8c03c0983b3a056cd87e9cd073d275fb6f"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.123819 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" podUID="adaa00a7-7a31-40ae-975e-47306e8128e8" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.131558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" event={"ID":"7955a9b4-f81b-45cd-bc57-b96bef24b064","Type":"ContainerStarted","Data":"5397d10ad9bde702ba1530c0fb071739f202b3f4606c2224d4d47e956eae774f"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.135647 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" event={"ID":"5ae4d03d-68a5-498a-992f-df43dbeebc73","Type":"ContainerStarted","Data":"8ca134abb64a5d2be48df8e0c0d34a4a61e83baaf18a7fb18f056816aa9baad3"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.135686 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" event={"ID":"5ae4d03d-68a5-498a-992f-df43dbeebc73","Type":"ContainerStarted","Data":"838a55aff0a0d40dccf5d016b2ad7d1fd26f6301334e5cd42f51a6c5a4f8fd3e"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.143772 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" podUID="5ae4d03d-68a5-498a-992f-df43dbeebc73" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.145769 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" event={"ID":"5c70f54b-6405-4dcc-a2d2-e989b2516f0e","Type":"ContainerStarted","Data":"217f23c905fe33e5d3755d517f3ce096a8a75a6e25dfd5ff57aa698d95ca8c54"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.146889 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" event={"ID":"a6acae25-c5f3-4719-9a0d-866cef31aae8","Type":"ContainerStarted","Data":"ae301675e7a60aea91f4af7e6e98c2ccf69fbe6cedda1433b4dfc4e08f1a981d"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.162584 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" event={"ID":"be907fa2-e5ce-461e-bad7-7ff67b7b28fc","Type":"ContainerStarted","Data":"6e325218cc3016ba31b241b04537edef55ad9b15a28b5733171e19222aea34b3"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.164642 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" event={"ID":"6293cdef-44a9-4639-a40d-df02e9aa8410","Type":"ContainerStarted","Data":"4d124fd8b0084f8c19744e88a982bbdff13dc87396cf382eef1b360ecdc43cf2"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.224820 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" event={"ID":"1dc02df0-0d0c-49bc-b3e8-69efc93c3167","Type":"ContainerStarted","Data":"28ecf945abbaa856399d602ae601d960db1981b2cda043dfa3e0b5d3838109c6"} Sep 
30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.224863 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" event={"ID":"1dc02df0-0d0c-49bc-b3e8-69efc93c3167","Type":"ContainerStarted","Data":"0f41c2fc3c6478f7259f142ec62025e062710e7b948a920853919cb0c593f89e"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.224875 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" event={"ID":"1dc02df0-0d0c-49bc-b3e8-69efc93c3167","Type":"ContainerStarted","Data":"e03ac6fe38ecb8d9cf36ce58b840c8377ed791192af2db607d44a784b81a0e95"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.224905 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.230398 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" event={"ID":"bb137b23-9366-4d2c-bc9d-ec50ccaef237","Type":"ContainerStarted","Data":"5e75c8fe3e9d2b906af1bc5c3c52ddfee6130745fc9f5cc2342819abb4539ac2"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.231714 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" podUID="bb137b23-9366-4d2c-bc9d-ec50ccaef237" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.234097 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" 
event={"ID":"5502e6e0-3d2f-479c-a53f-005bbb749631","Type":"ContainerStarted","Data":"55d695db5f1a54605cb3dafa69d800ab61a10f6b698d40584eb05a111f664861"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.234120 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" event={"ID":"5502e6e0-3d2f-479c-a53f-005bbb749631","Type":"ContainerStarted","Data":"8c156d5f21ed96467f25c56fbd5fc0d20d93ca0b14ff4acc35370836b5aea22e"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.235119 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" podUID="5502e6e0-3d2f-479c-a53f-005bbb749631" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.235333 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" event={"ID":"1416686e-3057-4219-93e8-b6ed99e1b000","Type":"ContainerStarted","Data":"d8382f89d659cc551559ba58ef1d15b38ea02c90b85e2100cd8d3c22d7743218"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.236933 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" event={"ID":"a21e2f23-2adc-4f24-be18-72c39bb6ac8e","Type":"ContainerStarted","Data":"cafe0b38daf420c0b8151d6858fdfa9b6a3157a6743379dac7115bd765e8abcf"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.236957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" 
event={"ID":"a21e2f23-2adc-4f24-be18-72c39bb6ac8e","Type":"ContainerStarted","Data":"3456e0fad25c93e834573bc085614f375037e7860cac48c98da0ad62b58e9c81"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.237878 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" podUID="a21e2f23-2adc-4f24-be18-72c39bb6ac8e" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.261439 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" event={"ID":"befb38ef-208d-435f-820a-787301b3c4b8","Type":"ContainerStarted","Data":"41172efa0c075208faca801a57c962973a0d3ceaf89725d5dcc95841b339ac71"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.265480 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" podStartSLOduration=2.265465691 podStartE2EDuration="2.265465691s" podCreationTimestamp="2025-09-30 03:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:10:16.248974088 +0000 UTC m=+943.422194052" watchObservedRunningTime="2025-09-30 03:10:16.265465691 +0000 UTC m=+943.438685665" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.280163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" event={"ID":"bcc255bc-d09d-4f16-b541-4e206fb39a80","Type":"ContainerStarted","Data":"1aab8683de09c3d5be3fc09f9d2001b7f2ddc034a7bae1a38cc8fead73259829"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.280205 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" event={"ID":"bcc255bc-d09d-4f16-b541-4e206fb39a80","Type":"ContainerStarted","Data":"79bebe71578805ef368c8fbe7dcb01cea773582c9648ec8f74436488e3afa959"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.284753 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" event={"ID":"edff3052-2bfd-47d9-be42-5d8f608fc529","Type":"ContainerStarted","Data":"408d4ffd159d89808f0881f4fb6eb401194707d9d9289b858e5e60c44a15fe8a"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.289528 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" podUID="bcc255bc-d09d-4f16-b541-4e206fb39a80" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.289707 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" event={"ID":"c46f8a8d-f07f-4983-9971-6b06d47c8e38","Type":"ContainerStarted","Data":"8674a6e71dd21600f99e6203369f6f0c3894847e0ec431837d18ffd76bcb5ca0"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.304585 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" event={"ID":"c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad","Type":"ContainerStarted","Data":"70709272f65a374bfbc8c3ea652063695025f00a7f415780a0b0017317adeb19"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.318811 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" 
event={"ID":"015eb732-be5e-404f-81e2-b43d012c356b","Type":"ContainerStarted","Data":"41a55ca737611f89479601496923a8d603e0cd4b33213498c3257a20f998511b"} Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.318855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" event={"ID":"015eb732-be5e-404f-81e2-b43d012c356b","Type":"ContainerStarted","Data":"9401e8e70c091dade8864790bb5f78ff1ae90cc8451ff2f5f3f3026199c73b77"} Sep 30 03:10:16 crc kubenswrapper[4744]: E0930 03:10:16.321834 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" podUID="015eb732-be5e-404f-81e2-b43d012c356b" Sep 30 03:10:16 crc kubenswrapper[4744]: I0930 03:10:16.376774 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb"] Sep 30 03:10:17 crc kubenswrapper[4744]: I0930 03:10:17.332965 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" event={"ID":"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc","Type":"ContainerStarted","Data":"5247c694ff754a504887fe6563944ea0aa35f3068a2af8f0f65f09bd80775aec"} Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 03:10:17.338723 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" 
podUID="5ae4d03d-68a5-498a-992f-df43dbeebc73" Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 03:10:17.338854 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" podUID="bcc255bc-d09d-4f16-b541-4e206fb39a80" Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 03:10:17.338907 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" podUID="adaa00a7-7a31-40ae-975e-47306e8128e8" Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 03:10:17.338972 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" podUID="a21e2f23-2adc-4f24-be18-72c39bb6ac8e" Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 03:10:17.338975 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" podUID="bb137b23-9366-4d2c-bc9d-ec50ccaef237" Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 
03:10:17.339021 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:a303e460aec09217f90043b8ff19c01061af003b614833b33a593df9c00ddf80\\\"\"" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" podUID="5502e6e0-3d2f-479c-a53f-005bbb749631" Sep 30 03:10:17 crc kubenswrapper[4744]: E0930 03:10:17.339315 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" podUID="015eb732-be5e-404f-81e2-b43d012c356b" Sep 30 03:10:24 crc kubenswrapper[4744]: I0930 03:10:24.529012 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5ff84bd547-bw5gx" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.410326 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" event={"ID":"a6acae25-c5f3-4719-9a0d-866cef31aae8","Type":"ContainerStarted","Data":"8c1e0346efc007a3db8424fa5af2e501a2cba6b2213d031bbf9b777234ddf5a3"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.411722 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" event={"ID":"6293cdef-44a9-4639-a40d-df02e9aa8410","Type":"ContainerStarted","Data":"a5c313cb31a270a4e0fb025c36e35d68065b71c17044917086a8bf0b9cca2be1"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.413241 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" event={"ID":"c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad","Type":"ContainerStarted","Data":"329861b77212b71016173f3e16f742721e4037421c76b8d07a65c3049c8b0023"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.413263 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" event={"ID":"c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad","Type":"ContainerStarted","Data":"ea0ebdff68181d5a83d43a54c236f1e67f6ffa0018eeb711380ddc54881d426b"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.413449 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.426930 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" event={"ID":"48c24f9c-7ad2-4b16-8586-a98cc6f5745d","Type":"ContainerStarted","Data":"48b42a2cd2c569f7ed85178637b9496cc24186811e814fbf125fadd3128c54bb"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.426956 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" event={"ID":"48c24f9c-7ad2-4b16-8586-a98cc6f5745d","Type":"ContainerStarted","Data":"e47f07cf85de78bf5cd5fd549fe9ec5d70a0f2a2e69d018c126f98f142afc796"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.427521 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.430056 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" 
event={"ID":"befb38ef-208d-435f-820a-787301b3c4b8","Type":"ContainerStarted","Data":"80fc815e254e913b96fae7c7c80a5910e13137b18b7686ad251d94591ccb787a"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.430087 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" event={"ID":"befb38ef-208d-435f-820a-787301b3c4b8","Type":"ContainerStarted","Data":"76a1c1c8f40090f1ad162842509aa39e0998ab5354c7bfe1b975404b3b32653e"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.430236 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.432549 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" event={"ID":"464a78d3-19ea-4024-95f8-65c384a11de5","Type":"ContainerStarted","Data":"c081e79cb3dbf3194fa8e298ed316a31e3357de7ef77bd56084c520d7e96a36c"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.437757 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" event={"ID":"edff3052-2bfd-47d9-be42-5d8f608fc529","Type":"ContainerStarted","Data":"ad8d886b8d03e03ee8afade363e8b0e7aca10c1f73092830b1b10f1293402a07"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.438684 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" podStartSLOduration=3.747330291 podStartE2EDuration="12.438673052s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.257072054 +0000 UTC m=+942.430292048" lastFinishedPulling="2025-09-30 03:10:23.948414835 +0000 UTC m=+951.121634809" observedRunningTime="2025-09-30 03:10:25.433312246 +0000 UTC m=+952.606532220" 
watchObservedRunningTime="2025-09-30 03:10:25.438673052 +0000 UTC m=+952.611893026" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.440779 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" event={"ID":"be907fa2-e5ce-461e-bad7-7ff67b7b28fc","Type":"ContainerStarted","Data":"dadc43484fc03fe4a221e2f4409887da39a31e0c939ced07e694273afebe22a1"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.440808 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" event={"ID":"be907fa2-e5ce-461e-bad7-7ff67b7b28fc","Type":"ContainerStarted","Data":"61be992c807d3b4ad50a6e5fe825c75b044161448602d339d4998f9eec90cec1"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.441443 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.443359 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" event={"ID":"c46f8a8d-f07f-4983-9971-6b06d47c8e38","Type":"ContainerStarted","Data":"1d716cbc3ccbd81d4720eeab21a9bc4e4868c1de1ec3116809f408ef98bc8ec1"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.446298 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" event={"ID":"2754694b-4135-4439-ae89-dd08166467a5","Type":"ContainerStarted","Data":"3a83545255c76577838bb50d724a188375446ec2121f0de092249c5cc780cfa7"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.446422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" 
event={"ID":"2754694b-4135-4439-ae89-dd08166467a5","Type":"ContainerStarted","Data":"5df406a244423bb45542c2a7a14e2bdf0c236d49cb911a0e2455d13beb335373"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.446948 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.458434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" event={"ID":"7c60f5e8-9ac8-4729-9030-a17a74c66872","Type":"ContainerStarted","Data":"97498b6744b22361f2d21786bbce7797819c3c191e63f88d498a7e4478fa3b11"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.463800 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" event={"ID":"7955a9b4-f81b-45cd-bc57-b96bef24b064","Type":"ContainerStarted","Data":"fccee40ef707685d99f7191908927d31df1b2161b70e9e0ba05b6fc76d884751"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.475932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" event={"ID":"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc","Type":"ContainerStarted","Data":"645ed1f889037b80bb7ec574e1eac2c6dfcb7fc365ce0857c2ee4ed921fd469e"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.477042 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" podStartSLOduration=3.322678527 podStartE2EDuration="12.477028562s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:14.737489613 +0000 UTC m=+941.910709587" lastFinishedPulling="2025-09-30 03:10:23.891839648 +0000 UTC m=+951.065059622" observedRunningTime="2025-09-30 03:10:25.476205637 +0000 UTC m=+952.649425611" 
watchObservedRunningTime="2025-09-30 03:10:25.477028562 +0000 UTC m=+952.650248536" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.478250 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" podStartSLOduration=3.809319344 podStartE2EDuration="12.4782445s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.230930302 +0000 UTC m=+942.404150286" lastFinishedPulling="2025-09-30 03:10:23.899855468 +0000 UTC m=+951.073075442" observedRunningTime="2025-09-30 03:10:25.456256808 +0000 UTC m=+952.629476782" watchObservedRunningTime="2025-09-30 03:10:25.4782445 +0000 UTC m=+952.651464474" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.491756 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" event={"ID":"5c70f54b-6405-4dcc-a2d2-e989b2516f0e","Type":"ContainerStarted","Data":"f9e617d49a7e215aa97ca4f7ea73beabdb9ded4a1258aa40bb6c85a721530568"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.491802 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" event={"ID":"5c70f54b-6405-4dcc-a2d2-e989b2516f0e","Type":"ContainerStarted","Data":"8c71df32e5880775bc06f2ff893b0affd815bfe3c88d0b8541be7b2736e9bb8d"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.492418 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.508153 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" podStartSLOduration=3.764823253 podStartE2EDuration="12.508138289s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 
03:10:15.230766276 +0000 UTC m=+942.403986240" lastFinishedPulling="2025-09-30 03:10:23.974081302 +0000 UTC m=+951.147301276" observedRunningTime="2025-09-30 03:10:25.505964921 +0000 UTC m=+952.679184895" watchObservedRunningTime="2025-09-30 03:10:25.508138289 +0000 UTC m=+952.681358263" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.523488 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.523523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" event={"ID":"1416686e-3057-4219-93e8-b6ed99e1b000","Type":"ContainerStarted","Data":"e4d1b88a02f784a7e53267afffd527778dbf47bb56e9aa980ca91ed9eae5db59"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.523538 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" event={"ID":"1416686e-3057-4219-93e8-b6ed99e1b000","Type":"ContainerStarted","Data":"9e56a1178e4ea7c280e98bfa2a8af1ddcc11c997c9587f83b08cb2623b56ac73"} Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.537096 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" podStartSLOduration=3.369433768 podStartE2EDuration="12.537079357s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:14.781404306 +0000 UTC m=+941.954624280" lastFinishedPulling="2025-09-30 03:10:23.949049895 +0000 UTC m=+951.122269869" observedRunningTime="2025-09-30 03:10:25.533260768 +0000 UTC m=+952.706480742" watchObservedRunningTime="2025-09-30 03:10:25.537079357 +0000 UTC m=+952.710299331" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.563769 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" podStartSLOduration=3.807631142 podStartE2EDuration="12.563753035s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.182130336 +0000 UTC m=+942.355350310" lastFinishedPulling="2025-09-30 03:10:23.938252229 +0000 UTC m=+951.111472203" observedRunningTime="2025-09-30 03:10:25.558601675 +0000 UTC m=+952.731821659" watchObservedRunningTime="2025-09-30 03:10:25.563753035 +0000 UTC m=+952.736973009" Sep 30 03:10:25 crc kubenswrapper[4744]: I0930 03:10:25.588227 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" podStartSLOduration=3.870091791 podStartE2EDuration="12.588211194s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.23088679 +0000 UTC m=+942.404106764" lastFinishedPulling="2025-09-30 03:10:23.949006183 +0000 UTC m=+951.122226167" observedRunningTime="2025-09-30 03:10:25.58744755 +0000 UTC m=+952.760667534" watchObservedRunningTime="2025-09-30 03:10:25.588211194 +0000 UTC m=+952.761431168" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.524865 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" event={"ID":"6293cdef-44a9-4639-a40d-df02e9aa8410","Type":"ContainerStarted","Data":"ecf7dacb421c02bd6a2910fa2ec4464a5de4cb2c0874887f13d3eba86312ce98"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.525023 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.527791 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" 
event={"ID":"7c60f5e8-9ac8-4729-9030-a17a74c66872","Type":"ContainerStarted","Data":"5fa5499d16751e3905ff574c5bfd10b21510889f1f2d49f6fa3e26997b05bfb4"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.527927 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.530249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" event={"ID":"7955a9b4-f81b-45cd-bc57-b96bef24b064","Type":"ContainerStarted","Data":"a19193550a3274aca7ebfa6bc13fe31e296bab4006faac157abc8d2c415493ef"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.530419 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.536054 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" event={"ID":"464a78d3-19ea-4024-95f8-65c384a11de5","Type":"ContainerStarted","Data":"cd17c2a98ce56fd0a7ef93443d362b3314bcc52997d96c69d9ce34bc48de46d4"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.536252 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.538691 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" event={"ID":"edff3052-2bfd-47d9-be42-5d8f608fc529","Type":"ContainerStarted","Data":"0302b15d339498eb9148bf5bd828ad9009082fe5d63506eef3c0920c3bf5af2d"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.538858 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.540785 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" event={"ID":"2fdc94bc-95cf-4a16-a6cc-0d277f4969bc","Type":"ContainerStarted","Data":"690e3c61d9ad7c9b7e0ae4233d5fb9e704fa62c2ad770a6937589e18951e86dc"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.540929 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.542987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" event={"ID":"c46f8a8d-f07f-4983-9971-6b06d47c8e38","Type":"ContainerStarted","Data":"27d35ba79840687a5a2447ddd8c19e31f348b777c03bbb467cf26a7969c029d3"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.543189 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.546296 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" event={"ID":"a6acae25-c5f3-4719-9a0d-866cef31aae8","Type":"ContainerStarted","Data":"d1e3c24e7c7c78416c58ca0abf18e518dd7bc1c9028a6c5f258583ecb093a590"} Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.547173 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" podStartSLOduration=5.154728974 podStartE2EDuration="13.547162176s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.583119736 +0000 UTC m=+942.756339710" 
lastFinishedPulling="2025-09-30 03:10:23.975552938 +0000 UTC m=+951.148772912" observedRunningTime="2025-09-30 03:10:26.545825615 +0000 UTC m=+953.719045619" watchObservedRunningTime="2025-09-30 03:10:26.547162176 +0000 UTC m=+953.720382150" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.580007 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" podStartSLOduration=4.828976172 podStartE2EDuration="13.579983685s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.22408631 +0000 UTC m=+942.397306284" lastFinishedPulling="2025-09-30 03:10:23.975093823 +0000 UTC m=+951.148313797" observedRunningTime="2025-09-30 03:10:26.574835145 +0000 UTC m=+953.748055159" watchObservedRunningTime="2025-09-30 03:10:26.579983685 +0000 UTC m=+953.753203699" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.632470 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" podStartSLOduration=4.497589774 podStartE2EDuration="13.632450304s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:14.801261143 +0000 UTC m=+941.974481117" lastFinishedPulling="2025-09-30 03:10:23.936121663 +0000 UTC m=+951.109341647" observedRunningTime="2025-09-30 03:10:26.610090559 +0000 UTC m=+953.783310573" watchObservedRunningTime="2025-09-30 03:10:26.632450304 +0000 UTC m=+953.805670298" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.634650 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" podStartSLOduration=5.1531850949999995 podStartE2EDuration="13.634640662s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.508318463 +0000 UTC m=+942.681538437" lastFinishedPulling="2025-09-30 
03:10:23.98977402 +0000 UTC m=+951.162994004" observedRunningTime="2025-09-30 03:10:26.62975972 +0000 UTC m=+953.802979724" watchObservedRunningTime="2025-09-30 03:10:26.634640662 +0000 UTC m=+953.807860656" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.670234 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" podStartSLOduration=6.082107966 podStartE2EDuration="13.670211486s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:16.406569541 +0000 UTC m=+943.579789515" lastFinishedPulling="2025-09-30 03:10:23.994673061 +0000 UTC m=+951.167893035" observedRunningTime="2025-09-30 03:10:26.664965273 +0000 UTC m=+953.838185287" watchObservedRunningTime="2025-09-30 03:10:26.670211486 +0000 UTC m=+953.843431490" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.697070 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" podStartSLOduration=4.971931028 podStartE2EDuration="13.697042669s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.251089487 +0000 UTC m=+942.424309461" lastFinishedPulling="2025-09-30 03:10:23.976201118 +0000 UTC m=+951.149421102" observedRunningTime="2025-09-30 03:10:26.691116085 +0000 UTC m=+953.864336089" watchObservedRunningTime="2025-09-30 03:10:26.697042669 +0000 UTC m=+953.870262673" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.716482 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" podStartSLOduration=4.5351429880000005 podStartE2EDuration="13.716461822s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:14.799115386 +0000 UTC m=+941.972335360" lastFinishedPulling="2025-09-30 
03:10:23.98043421 +0000 UTC m=+951.153654194" observedRunningTime="2025-09-30 03:10:26.709705572 +0000 UTC m=+953.882925556" watchObservedRunningTime="2025-09-30 03:10:26.716461822 +0000 UTC m=+953.889681826" Sep 30 03:10:26 crc kubenswrapper[4744]: I0930 03:10:26.735176 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" podStartSLOduration=5.060944882 podStartE2EDuration="13.735148432s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.263790102 +0000 UTC m=+942.437010076" lastFinishedPulling="2025-09-30 03:10:23.937993652 +0000 UTC m=+951.111213626" observedRunningTime="2025-09-30 03:10:26.728633209 +0000 UTC m=+953.901853193" watchObservedRunningTime="2025-09-30 03:10:26.735148432 +0000 UTC m=+953.908368446" Sep 30 03:10:27 crc kubenswrapper[4744]: I0930 03:10:27.559121 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:32 crc kubenswrapper[4744]: I0930 03:10:32.607909 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" event={"ID":"a21e2f23-2adc-4f24-be18-72c39bb6ac8e","Type":"ContainerStarted","Data":"55cec94e5d3a85b89cbc9c023bbbcab8969482a7316df7d4b7d2bc14289e4140"} Sep 30 03:10:32 crc kubenswrapper[4744]: I0930 03:10:32.608896 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:32 crc kubenswrapper[4744]: I0930 03:10:32.626516 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" podStartSLOduration=2.843423427 podStartE2EDuration="19.626501445s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 
03:10:15.536632273 +0000 UTC m=+942.709852247" lastFinishedPulling="2025-09-30 03:10:32.319710291 +0000 UTC m=+959.492930265" observedRunningTime="2025-09-30 03:10:32.624142112 +0000 UTC m=+959.797362096" watchObservedRunningTime="2025-09-30 03:10:32.626501445 +0000 UTC m=+959.799721419" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.615504 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" event={"ID":"015eb732-be5e-404f-81e2-b43d012c356b","Type":"ContainerStarted","Data":"e901539bfbf87a2899a77b19fbab77dd41875fe32126b6abb38cd4ef38147d48"} Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.616412 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.649868 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" podStartSLOduration=3.575328991 podStartE2EDuration="20.649853506s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.289130139 +0000 UTC m=+942.462350103" lastFinishedPulling="2025-09-30 03:10:32.363654654 +0000 UTC m=+959.536874618" observedRunningTime="2025-09-30 03:10:33.648759392 +0000 UTC m=+960.821979366" watchObservedRunningTime="2025-09-30 03:10:33.649853506 +0000 UTC m=+960.823073480" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.661628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" event={"ID":"adaa00a7-7a31-40ae-975e-47306e8128e8","Type":"ContainerStarted","Data":"6a0880b9a5bceba956e796aaa8cd9a876aef720b5f24a08c9921bbf76a8fd0d5"} Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.662211 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.691594 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" podStartSLOduration=3.5668179049999997 podStartE2EDuration="20.691571351s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.268831198 +0000 UTC m=+942.442051172" lastFinishedPulling="2025-09-30 03:10:32.393584644 +0000 UTC m=+959.566804618" observedRunningTime="2025-09-30 03:10:33.687006279 +0000 UTC m=+960.860226253" watchObservedRunningTime="2025-09-30 03:10:33.691571351 +0000 UTC m=+960.864791325" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.722786 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-nh8lm" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.737071 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lfgcl" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.765237 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-r4pn4" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.820222 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-f5wr4" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.838528 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jrs8k" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.853531 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jrmql" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.908762 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-plwv5" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.933725 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-n4bm8" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.934997 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-xc7mx" Sep 30 03:10:33 crc kubenswrapper[4744]: I0930 03:10:33.967796 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5whrj" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.013639 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-hjmhz" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.123835 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-g6w7n" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.198188 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-kd8v7" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.347648 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.347702 4744 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.347739 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.348135 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a5c6bc379bf988ae0369b42f93fd361d89694e20343a5b27933e4ef1594e651"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.348187 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://8a5c6bc379bf988ae0369b42f93fd361d89694e20343a5b27933e4ef1594e651" gracePeriod=600 Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.477192 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-pvzfj" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.670693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" event={"ID":"bb137b23-9366-4d2c-bc9d-ec50ccaef237","Type":"ContainerStarted","Data":"7ab97e6fefb06c52feb448ea24ea6dcda4aca7139bc8a73ab3f7df051d1ace92"} Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.675110 4744 
generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="8a5c6bc379bf988ae0369b42f93fd361d89694e20343a5b27933e4ef1594e651" exitCode=0 Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.675189 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"8a5c6bc379bf988ae0369b42f93fd361d89694e20343a5b27933e4ef1594e651"} Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.675240 4744 scope.go:117] "RemoveContainer" containerID="30f20d65f55e83fb7df6fb2f203d982a107f210e9c52e670591915139c564a0e" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.678218 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" event={"ID":"5502e6e0-3d2f-479c-a53f-005bbb749631","Type":"ContainerStarted","Data":"ab629a91fa05f7e80e89a045ca6bc6bfce5cb0f891091b8b2454c5df7c21a5c5"} Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.679225 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.683924 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" event={"ID":"5ae4d03d-68a5-498a-992f-df43dbeebc73","Type":"ContainerStarted","Data":"c34b0d38b87849e2eb7d831dbf0321bf9b0476491cf27dbe0b553a96dd90bd7b"} Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.684647 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.693095 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-2vpmm" podStartSLOduration=3.836971401 podStartE2EDuration="20.693075694s" podCreationTimestamp="2025-09-30 03:10:14 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.536642524 +0000 UTC m=+942.709862498" lastFinishedPulling="2025-09-30 03:10:32.392746767 +0000 UTC m=+959.565966791" observedRunningTime="2025-09-30 03:10:34.692155085 +0000 UTC m=+961.865375059" watchObservedRunningTime="2025-09-30 03:10:34.693075694 +0000 UTC m=+961.866295668" Sep 30 03:10:34 crc kubenswrapper[4744]: I0930 03:10:34.711635 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" podStartSLOduration=3.653657722 podStartE2EDuration="21.711615099s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.509764599 +0000 UTC m=+942.682984573" lastFinishedPulling="2025-09-30 03:10:33.567721976 +0000 UTC m=+960.740941950" observedRunningTime="2025-09-30 03:10:34.710525315 +0000 UTC m=+961.883745289" watchObservedRunningTime="2025-09-30 03:10:34.711615099 +0000 UTC m=+961.884835073" Sep 30 03:10:35 crc kubenswrapper[4744]: I0930 03:10:35.695923 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" event={"ID":"bcc255bc-d09d-4f16-b541-4e206fb39a80","Type":"ContainerStarted","Data":"5f7537022c6c63cb7f15403728d22a071a970163116b65158bfb3de564a132dd"} Sep 30 03:10:35 crc kubenswrapper[4744]: I0930 03:10:35.696633 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:35 crc kubenswrapper[4744]: I0930 03:10:35.700571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"1d72e9221a902ba71a0038b939d0d12d57f148cf38a3a98c9981e273e6748a54"} Sep 30 03:10:35 crc kubenswrapper[4744]: I0930 03:10:35.724109 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" podStartSLOduration=4.470336027 podStartE2EDuration="22.724079603s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.32042092 +0000 UTC m=+942.493640894" lastFinishedPulling="2025-09-30 03:10:33.574164496 +0000 UTC m=+960.747384470" observedRunningTime="2025-09-30 03:10:34.732685634 +0000 UTC m=+961.905905608" watchObservedRunningTime="2025-09-30 03:10:35.724079603 +0000 UTC m=+962.897299617" Sep 30 03:10:35 crc kubenswrapper[4744]: I0930 03:10:35.729183 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" podStartSLOduration=3.247735189 podStartE2EDuration="22.72916143s" podCreationTimestamp="2025-09-30 03:10:13 +0000 UTC" firstStartedPulling="2025-09-30 03:10:15.268328283 +0000 UTC m=+942.441548277" lastFinishedPulling="2025-09-30 03:10:34.749754544 +0000 UTC m=+961.922974518" observedRunningTime="2025-09-30 03:10:35.719544552 +0000 UTC m=+962.892764556" watchObservedRunningTime="2025-09-30 03:10:35.72916143 +0000 UTC m=+962.902381464" Sep 30 03:10:35 crc kubenswrapper[4744]: I0930 03:10:35.765418 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-gmkqb" Sep 30 03:10:44 crc kubenswrapper[4744]: I0930 03:10:44.001066 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-hl4qt" Sep 30 03:10:44 crc kubenswrapper[4744]: I0930 03:10:44.116901 4744 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-zvxch" Sep 30 03:10:44 crc kubenswrapper[4744]: I0930 03:10:44.138029 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-pcn79" Sep 30 03:10:44 crc kubenswrapper[4744]: I0930 03:10:44.431565 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-5fx9s" Sep 30 03:10:44 crc kubenswrapper[4744]: I0930 03:10:44.448400 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-p5zk9" Sep 30 03:10:44 crc kubenswrapper[4744]: I0930 03:10:44.499664 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-z8f6l" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.770526 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fknzx"] Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.772995 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.775748 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.779837 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fknzx"] Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.782449 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4rhnn" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.810035 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk96t\" (UniqueName: \"kubernetes.io/projected/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-kube-api-access-sk96t\") pod \"dnsmasq-dns-675f4bcbfc-fknzx\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.810148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-config\") pod \"dnsmasq-dns-675f4bcbfc-fknzx\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.828330 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4d64w"] Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.837206 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.838844 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.840487 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4d64w"] Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.911239 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nl7b\" (UniqueName: \"kubernetes.io/projected/14f44152-2c4d-4b63-9bac-0b626ed31685-kube-api-access-9nl7b\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.911423 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-config\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.911660 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-config\") pod \"dnsmasq-dns-675f4bcbfc-fknzx\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.911755 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:10:59 crc 
kubenswrapper[4744]: I0930 03:10:59.911827 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk96t\" (UniqueName: \"kubernetes.io/projected/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-kube-api-access-sk96t\") pod \"dnsmasq-dns-675f4bcbfc-fknzx\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.923960 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-config\") pod \"dnsmasq-dns-675f4bcbfc-fknzx\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:10:59 crc kubenswrapper[4744]: I0930 03:10:59.954404 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk96t\" (UniqueName: \"kubernetes.io/projected/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-kube-api-access-sk96t\") pod \"dnsmasq-dns-675f4bcbfc-fknzx\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.013499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.013601 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nl7b\" (UniqueName: \"kubernetes.io/projected/14f44152-2c4d-4b63-9bac-0b626ed31685-kube-api-access-9nl7b\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.013628 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-config\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.014652 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-config\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.015156 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.031343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nl7b\" (UniqueName: \"kubernetes.io/projected/14f44152-2c4d-4b63-9bac-0b626ed31685-kube-api-access-9nl7b\") pod \"dnsmasq-dns-78dd6ddcc-4d64w\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.101444 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.150335 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.564583 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fknzx"] Sep 30 03:11:00 crc kubenswrapper[4744]: W0930 03:11:00.573175 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3a5f1a2_ee44_43e4_9a5f_da28cdd3585e.slice/crio-9b7e9815e4d6f2489b400e554239bc8221512e958650dd3f04be51402ad2c88c WatchSource:0}: Error finding container 9b7e9815e4d6f2489b400e554239bc8221512e958650dd3f04be51402ad2c88c: Status 404 returned error can't find the container with id 9b7e9815e4d6f2489b400e554239bc8221512e958650dd3f04be51402ad2c88c Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.646148 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4d64w"] Sep 30 03:11:00 crc kubenswrapper[4744]: W0930 03:11:00.652465 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14f44152_2c4d_4b63_9bac_0b626ed31685.slice/crio-cb2bb423243347293d4e09a359b258597dfaa71a94c0f424d4311be56b4e1891 WatchSource:0}: Error finding container cb2bb423243347293d4e09a359b258597dfaa71a94c0f424d4311be56b4e1891: Status 404 returned error can't find the container with id cb2bb423243347293d4e09a359b258597dfaa71a94c0f424d4311be56b4e1891 Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.960429 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" event={"ID":"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e","Type":"ContainerStarted","Data":"9b7e9815e4d6f2489b400e554239bc8221512e958650dd3f04be51402ad2c88c"} Sep 30 03:11:00 crc kubenswrapper[4744]: I0930 03:11:00.966670 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" 
event={"ID":"14f44152-2c4d-4b63-9bac-0b626ed31685","Type":"ContainerStarted","Data":"cb2bb423243347293d4e09a359b258597dfaa71a94c0f424d4311be56b4e1891"} Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.554815 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fknzx"] Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.582238 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kbhdw"] Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.585031 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.599585 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kbhdw"] Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.751136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-config\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.751231 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9w4\" (UniqueName: \"kubernetes.io/projected/ea6885ed-b36b-4593-9cb2-43f9c1cef850-kube-api-access-nh9w4\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.751279 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.855712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9w4\" (UniqueName: \"kubernetes.io/projected/ea6885ed-b36b-4593-9cb2-43f9c1cef850-kube-api-access-nh9w4\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.855765 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.855827 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-config\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.856694 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-config\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.857532 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.868028 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4d64w"] Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.890282 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9w4\" (UniqueName: \"kubernetes.io/projected/ea6885ed-b36b-4593-9cb2-43f9c1cef850-kube-api-access-nh9w4\") pod \"dnsmasq-dns-5ccc8479f9-kbhdw\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.908297 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2wcjr"] Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.909824 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.915894 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2wcjr"] Sep 30 03:11:02 crc kubenswrapper[4744]: I0930 03:11:02.928430 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.059422 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwz9\" (UniqueName: \"kubernetes.io/projected/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-kube-api-access-rdwz9\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.060499 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-config\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.060561 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.162092 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.162246 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwz9\" (UniqueName: \"kubernetes.io/projected/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-kube-api-access-rdwz9\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: 
\"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.162288 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-config\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.165026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.165423 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-config\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.182846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwz9\" (UniqueName: \"kubernetes.io/projected/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-kube-api-access-rdwz9\") pod \"dnsmasq-dns-57d769cc4f-2wcjr\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.229878 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.237017 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kbhdw"] Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.554381 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2wcjr"] Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.751042 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.752970 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.754743 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.754972 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-z6w4x" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.757064 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.757436 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.757645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.757813 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.758063 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 
03:11:03.778305 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2d0096-8154-4723-aa53-80eaeb9e4d32-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883171 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883200 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883226 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883246 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883264 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883285 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883299 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2cp\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-kube-api-access-bk2cp\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883325 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883363 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.883408 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2d0096-8154-4723-aa53-80eaeb9e4d32-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984110 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984155 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984174 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2cp\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-kube-api-access-bk2cp\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984202 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" 
Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984239 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984269 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2d0096-8154-4723-aa53-80eaeb9e4d32-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2d0096-8154-4723-aa53-80eaeb9e4d32-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984353 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.984391 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.985266 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.985555 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.985865 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.986736 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.986777 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:03 crc kubenswrapper[4744]: I0930 03:11:03.986812 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.000643 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2d0096-8154-4723-aa53-80eaeb9e4d32-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.002019 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" event={"ID":"a9f869fc-4cf8-45c9-9eec-f2756338f2d9","Type":"ContainerStarted","Data":"6247b7a9f36e05b6dad1f9cb305eed1f647e43490762d6acebaf8ae23db7172f"} Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.003805 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" event={"ID":"ea6885ed-b36b-4593-9cb2-43f9c1cef850","Type":"ContainerStarted","Data":"1677a1a7309600c40d02aaeb9e2d87ebf34d7b11967f284ccedc1839335c2982"} Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.005613 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2d0096-8154-4723-aa53-80eaeb9e4d32-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.006012 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2cp\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-kube-api-access-bk2cp\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.008804 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.014549 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.041980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.062129 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.063662 4744 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.066243 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.066413 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.066544 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.066656 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.066822 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9m667" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.066952 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.068285 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.078510 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.091035 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.193172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79aeb9a3-f29e-49f0-af59-ae29868cc21e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.193826 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.193848 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.193980 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.194023 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc 
kubenswrapper[4744]: I0930 03:11:04.194069 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-config-data\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.194086 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.194181 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.194238 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.194360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79aeb9a3-f29e-49f0-af59-ae29868cc21e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.194477 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vmc\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-kube-api-access-j4vmc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297307 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297358 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297441 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79aeb9a3-f29e-49f0-af59-ae29868cc21e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297468 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4vmc\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-kube-api-access-j4vmc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297499 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/79aeb9a3-f29e-49f0-af59-ae29868cc21e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297568 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297606 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297620 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297641 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-config-data\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" 
Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.297655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.298470 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.299274 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.299279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.301324 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-config-data\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.301773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.302415 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.303809 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79aeb9a3-f29e-49f0-af59-ae29868cc21e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.304441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.305392 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.315757 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vmc\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-kube-api-access-j4vmc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 
03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.320486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79aeb9a3-f29e-49f0-af59-ae29868cc21e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.322586 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:11:04 crc kubenswrapper[4744]: I0930 03:11:04.398894 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.976590 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.978998 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.984749 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.984934 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qpdlb" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.984951 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.985236 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.987918 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.984273 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 03:11:06 crc kubenswrapper[4744]: I0930 03:11:06.994974 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.095515 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.102436 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.106602 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-68z2v" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.108072 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.108330 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.108415 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.114455 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138057 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138103 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-config-data-default\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138133 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-secrets\") pod \"openstack-galera-0\" (UID: 
\"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138180 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138201 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138216 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-kolla-config\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138229 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.138277 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2zms\" (UniqueName: \"kubernetes.io/projected/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-kube-api-access-h2zms\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239394 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239432 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2zms\" (UniqueName: \"kubernetes.io/projected/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-kube-api-access-h2zms\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239506 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1131b4e-532d-478b-bbd8-b52963f60462-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239544 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239567 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-config-data-default\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239602 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239639 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-secrets\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239737 4744 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239739 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvxn\" (UniqueName: \"kubernetes.io/projected/c1131b4e-532d-478b-bbd8-b52963f60462-kube-api-access-9gvxn\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239870 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239910 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.239998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240047 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240069 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-kolla-config\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240114 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240490 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240618 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-config-data-default\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.240968 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-kolla-config\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.241210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.246531 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-secrets\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.257227 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.258977 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.260649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2zms\" (UniqueName: \"kubernetes.io/projected/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-kube-api-access-h2zms\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.262531 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.262781 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hhpfs" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.263001 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.263361 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.263440 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf3db46-b4d2-469a-bc2e-dc5610bb2807-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.268820 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ddf3db46-b4d2-469a-bc2e-dc5610bb2807\") " 
pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.289403 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.310692 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.341895 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvxn\" (UniqueName: \"kubernetes.io/projected/c1131b4e-532d-478b-bbd8-b52963f60462-kube-api-access-9gvxn\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.341951 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.341978 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.342012 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.342035 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.342076 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.342108 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1131b4e-532d-478b-bbd8-b52963f60462-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.342136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.342151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.344012 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.344115 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.344631 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1131b4e-532d-478b-bbd8-b52963f60462-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.344759 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.345642 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1131b4e-532d-478b-bbd8-b52963f60462-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.346631 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-secrets\") pod \"openstack-cell1-galera-0\" 
(UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.347683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.351138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1131b4e-532d-478b-bbd8-b52963f60462-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.364290 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gvxn\" (UniqueName: \"kubernetes.io/projected/c1131b4e-532d-478b-bbd8-b52963f60462-kube-api-access-9gvxn\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.382220 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c1131b4e-532d-478b-bbd8-b52963f60462\") " pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.421571 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.443061 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.443147 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-config-data\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.443191 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.443226 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-kolla-config\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.443267 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5wx\" (UniqueName: \"kubernetes.io/projected/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-kube-api-access-xw5wx\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 
03:11:07.544560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.544648 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-config-data\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.544686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.544717 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-kolla-config\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.544768 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5wx\" (UniqueName: \"kubernetes.io/projected/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-kube-api-access-xw5wx\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.545602 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-kolla-config\") pod 
\"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.545872 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-config-data\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.549091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.549312 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.563269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5wx\" (UniqueName: \"kubernetes.io/projected/0eb33cf6-e46d-4f10-b794-6707d21fc4ab-kube-api-access-xw5wx\") pod \"memcached-0\" (UID: \"0eb33cf6-e46d-4f10-b794-6707d21fc4ab\") " pod="openstack/memcached-0" Sep 30 03:11:07 crc kubenswrapper[4744]: I0930 03:11:07.633684 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.065544 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.066923 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.072099 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7vhl7" Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.073459 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.167130 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhsmk\" (UniqueName: \"kubernetes.io/projected/a6379476-59f6-4c51-8f3f-7ea563d15030-kube-api-access-fhsmk\") pod \"kube-state-metrics-0\" (UID: \"a6379476-59f6-4c51-8f3f-7ea563d15030\") " pod="openstack/kube-state-metrics-0" Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.269159 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhsmk\" (UniqueName: \"kubernetes.io/projected/a6379476-59f6-4c51-8f3f-7ea563d15030-kube-api-access-fhsmk\") pod \"kube-state-metrics-0\" (UID: \"a6379476-59f6-4c51-8f3f-7ea563d15030\") " pod="openstack/kube-state-metrics-0" Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.308934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhsmk\" (UniqueName: \"kubernetes.io/projected/a6379476-59f6-4c51-8f3f-7ea563d15030-kube-api-access-fhsmk\") pod \"kube-state-metrics-0\" (UID: \"a6379476-59f6-4c51-8f3f-7ea563d15030\") " pod="openstack/kube-state-metrics-0" Sep 30 03:11:09 crc kubenswrapper[4744]: I0930 03:11:09.386949 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.264896 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m95jr"] Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.266665 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.269446 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-wz5nf" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.269533 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.269464 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.286382 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-t9l7c"] Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.287957 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.295900 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m95jr"] Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.310201 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t9l7c"] Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.370060 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-run-ovn\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.370136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-log-ovn\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.370168 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-run\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.370249 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa7757e-eced-4195-8b1d-88fd7a3b322d-combined-ca-bundle\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.371024 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa7757e-eced-4195-8b1d-88fd7a3b322d-ovn-controller-tls-certs\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.371075 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa7757e-eced-4195-8b1d-88fd7a3b322d-scripts\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.371113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2jb\" (UniqueName: \"kubernetes.io/projected/6aa7757e-eced-4195-8b1d-88fd7a3b322d-kube-api-access-cw2jb\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485482 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa7757e-eced-4195-8b1d-88fd7a3b322d-scripts\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2jb\" (UniqueName: \"kubernetes.io/projected/6aa7757e-eced-4195-8b1d-88fd7a3b322d-kube-api-access-cw2jb\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485592 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-run-ovn\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485619 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-run\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-log-ovn\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485660 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-etc-ovs\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485678 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-run\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485699 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-lib\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485726 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-log\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485746 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6fx\" (UniqueName: \"kubernetes.io/projected/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-kube-api-access-zb6fx\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485762 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa7757e-eced-4195-8b1d-88fd7a3b322d-combined-ca-bundle\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485781 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-scripts\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.485803 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6aa7757e-eced-4195-8b1d-88fd7a3b322d-ovn-controller-tls-certs\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.487196 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-run\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.487293 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-run-ovn\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.487515 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa7757e-eced-4195-8b1d-88fd7a3b322d-var-log-ovn\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.489539 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa7757e-eced-4195-8b1d-88fd7a3b322d-scripts\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.491640 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aa7757e-eced-4195-8b1d-88fd7a3b322d-ovn-controller-tls-certs\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " 
pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.491797 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aa7757e-eced-4195-8b1d-88fd7a3b322d-combined-ca-bundle\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.507881 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2jb\" (UniqueName: \"kubernetes.io/projected/6aa7757e-eced-4195-8b1d-88fd7a3b322d-kube-api-access-cw2jb\") pod \"ovn-controller-m95jr\" (UID: \"6aa7757e-eced-4195-8b1d-88fd7a3b322d\") " pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.587374 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6fx\" (UniqueName: \"kubernetes.io/projected/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-kube-api-access-zb6fx\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.587918 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-scripts\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.587469 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m95jr" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588097 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-run\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588162 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-etc-ovs\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588349 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-run\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588365 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-lib\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-log\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588603 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-log\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588659 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-var-lib\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.588658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-etc-ovs\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.589883 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-scripts\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.602706 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6fx\" (UniqueName: \"kubernetes.io/projected/f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064-kube-api-access-zb6fx\") pod \"ovn-controller-ovs-t9l7c\" (UID: \"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064\") " pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:13 crc kubenswrapper[4744]: I0930 03:11:13.603610 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:15 crc kubenswrapper[4744]: E0930 03:11:15.555506 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 03:11:15 crc kubenswrapper[4744]: E0930 03:11:15.556923 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nl7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,Secu
rityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4d64w_openstack(14f44152-2c4d-4b63-9bac-0b626ed31685): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:11:15 crc kubenswrapper[4744]: E0930 03:11:15.558858 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" podUID="14f44152-2c4d-4b63-9bac-0b626ed31685" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.566740 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.571184 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.576407 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.576792 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.577375 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.577873 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.577613 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5vrzw" Sep 30 03:11:15 crc kubenswrapper[4744]: E0930 03:11:15.588304 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 03:11:15 crc kubenswrapper[4744]: E0930 03:11:15.588566 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sk96t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fknzx_openstack(c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:11:15 crc kubenswrapper[4744]: E0930 03:11:15.590935 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" podUID="c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.603003 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.723612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-config\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724088 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724166 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724194 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724217 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724242 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724271 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcqk\" (UniqueName: \"kubernetes.io/projected/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-kube-api-access-frcqk\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.724312 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.780207 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.791605 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.791857 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.797738 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qw78k" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.797994 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.798265 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.798680 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825331 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825375 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825409 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcqk\" (UniqueName: \"kubernetes.io/projected/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-kube-api-access-frcqk\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825474 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825511 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-config\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.825530 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.826299 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " 
pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.826568 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.826957 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-config\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.827259 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.829626 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.830084 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.835913 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.846124 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcqk\" (UniqueName: \"kubernetes.io/projected/7bfc1c21-6422-4308-8370-2dd0b26a3c1e-kube-api-access-frcqk\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.850461 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7bfc1c21-6422-4308-8370-2dd0b26a3c1e\") " pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.903130 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.926926 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.926987 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e0e55f0-f333-4bc6-9905-18adf601fb9c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.927012 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0e55f0-f333-4bc6-9905-18adf601fb9c-config\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.927029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.927047 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 
03:11:15.927070 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwv9\" (UniqueName: \"kubernetes.io/projected/1e0e55f0-f333-4bc6-9905-18adf601fb9c-kube-api-access-rhwv9\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.927103 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:15 crc kubenswrapper[4744]: I0930 03:11:15.927139 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e0e55f0-f333-4bc6-9905-18adf601fb9c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028605 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028675 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e0e55f0-f333-4bc6-9905-18adf601fb9c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1e0e55f0-f333-4bc6-9905-18adf601fb9c-config\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028728 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028751 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwv9\" (UniqueName: \"kubernetes.io/projected/1e0e55f0-f333-4bc6-9905-18adf601fb9c-kube-api-access-rhwv9\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028848 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028894 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e0e55f0-f333-4bc6-9905-18adf601fb9c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.028979 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.029318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1e0e55f0-f333-4bc6-9905-18adf601fb9c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.029736 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e0e55f0-f333-4bc6-9905-18adf601fb9c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.030886 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0e55f0-f333-4bc6-9905-18adf601fb9c-config\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.034222 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.041027 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.042455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e0e55f0-f333-4bc6-9905-18adf601fb9c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.046089 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwv9\" (UniqueName: \"kubernetes.io/projected/1e0e55f0-f333-4bc6-9905-18adf601fb9c-kube-api-access-rhwv9\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.051287 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1e0e55f0-f333-4bc6-9905-18adf601fb9c\") " pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.131082 4744 generic.go:334] "Generic (PLEG): container finished" podID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerID="24f5619a5975cd51a8e3b5fea396cd20cfb24b932d853638e967b46f49c88350" exitCode=0 Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.131135 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" event={"ID":"a9f869fc-4cf8-45c9-9eec-f2756338f2d9","Type":"ContainerDied","Data":"24f5619a5975cd51a8e3b5fea396cd20cfb24b932d853638e967b46f49c88350"} Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.136636 4744 generic.go:334] "Generic (PLEG): container 
finished" podID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerID="7c27945ce2290b78b4a4f53d659de5ef55990762e9cf0a30759f73f7e0d01711" exitCode=0 Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.136776 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" event={"ID":"ea6885ed-b36b-4593-9cb2-43f9c1cef850","Type":"ContainerDied","Data":"7c27945ce2290b78b4a4f53d659de5ef55990762e9cf0a30759f73f7e0d01711"} Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.188975 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.209113 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.225638 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.235674 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: W0930 03:11:16.251861 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79aeb9a3_f29e_49f0_af59_ae29868cc21e.slice/crio-e092da75813a67de8bf2b242c3ec544d741c373cff63dcb25bf56cf34b2b81e5 WatchSource:0}: Error finding container e092da75813a67de8bf2b242c3ec544d741c373cff63dcb25bf56cf34b2b81e5: Status 404 returned error can't find the container with id e092da75813a67de8bf2b242c3ec544d741c373cff63dcb25bf56cf34b2b81e5 Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.646555 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.656027 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 
03:11:16.669377 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: W0930 03:11:16.676998 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf3db46_b4d2_469a_bc2e_dc5610bb2807.slice/crio-b275c0133f69e0d21344263051b6f2992eeba3ae3dabfc00751042cb1e616e9b WatchSource:0}: Error finding container b275c0133f69e0d21344263051b6f2992eeba3ae3dabfc00751042cb1e616e9b: Status 404 returned error can't find the container with id b275c0133f69e0d21344263051b6f2992eeba3ae3dabfc00751042cb1e616e9b Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.801073 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.806215 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m95jr"] Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.823593 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.854603 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.895218 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-t9l7c"] Sep 30 03:11:16 crc kubenswrapper[4744]: W0930 03:11:16.902519 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f29c8c_e61d_4ec5_8a7c_c3c9079bb064.slice/crio-6aac3fce21b24af9b2451b82f459ac0418fcfb49f8bffd4d89393e1756b905bd WatchSource:0}: Error finding container 6aac3fce21b24af9b2451b82f459ac0418fcfb49f8bffd4d89393e1756b905bd: Status 404 returned error can't find the container with id 6aac3fce21b24af9b2451b82f459ac0418fcfb49f8bffd4d89393e1756b905bd Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.945251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-config\") pod \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.945342 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nl7b\" (UniqueName: \"kubernetes.io/projected/14f44152-2c4d-4b63-9bac-0b626ed31685-kube-api-access-9nl7b\") pod \"14f44152-2c4d-4b63-9bac-0b626ed31685\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.945426 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-dns-svc\") pod \"14f44152-2c4d-4b63-9bac-0b626ed31685\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.945473 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-config\") pod \"14f44152-2c4d-4b63-9bac-0b626ed31685\" (UID: \"14f44152-2c4d-4b63-9bac-0b626ed31685\") " Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.945503 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk96t\" (UniqueName: \"kubernetes.io/projected/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-kube-api-access-sk96t\") pod \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\" (UID: \"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e\") " Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.947523 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-config" (OuterVolumeSpecName: "config") pod "14f44152-2c4d-4b63-9bac-0b626ed31685" (UID: "14f44152-2c4d-4b63-9bac-0b626ed31685"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.947602 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14f44152-2c4d-4b63-9bac-0b626ed31685" (UID: "14f44152-2c4d-4b63-9bac-0b626ed31685"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.948616 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-config" (OuterVolumeSpecName: "config") pod "c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e" (UID: "c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.952585 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-kube-api-access-sk96t" (OuterVolumeSpecName: "kube-api-access-sk96t") pod "c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e" (UID: "c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e"). InnerVolumeSpecName "kube-api-access-sk96t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.952713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f44152-2c4d-4b63-9bac-0b626ed31685-kube-api-access-9nl7b" (OuterVolumeSpecName: "kube-api-access-9nl7b") pod "14f44152-2c4d-4b63-9bac-0b626ed31685" (UID: "14f44152-2c4d-4b63-9bac-0b626ed31685"). InnerVolumeSpecName "kube-api-access-9nl7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:16 crc kubenswrapper[4744]: I0930 03:11:16.996495 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 03:11:17 crc kubenswrapper[4744]: W0930 03:11:17.001369 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e0e55f0_f333_4bc6_9905_18adf601fb9c.slice/crio-c03a620137c1f7554fd9f3654eea75d62af4446791b489f37f8fcd55f18c0fef WatchSource:0}: Error finding container c03a620137c1f7554fd9f3654eea75d62af4446791b489f37f8fcd55f18c0fef: Status 404 returned error can't find the container with id c03a620137c1f7554fd9f3654eea75d62af4446791b489f37f8fcd55f18c0fef Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.048020 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.048045 4744 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f44152-2c4d-4b63-9bac-0b626ed31685-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.048054 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk96t\" (UniqueName: \"kubernetes.io/projected/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-kube-api-access-sk96t\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.048066 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.048076 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nl7b\" (UniqueName: \"kubernetes.io/projected/14f44152-2c4d-4b63-9bac-0b626ed31685-kube-api-access-9nl7b\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.145996 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t9l7c" event={"ID":"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064","Type":"ContainerStarted","Data":"6aac3fce21b24af9b2451b82f459ac0418fcfb49f8bffd4d89393e1756b905bd"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.147534 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr" event={"ID":"6aa7757e-eced-4195-8b1d-88fd7a3b322d","Type":"ContainerStarted","Data":"a577c8b1b67d2230a257eedbd8bc1bb683e433c25eea8344098c1be498d986bf"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.148589 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0eb33cf6-e46d-4f10-b794-6707d21fc4ab","Type":"ContainerStarted","Data":"c500103bd6935d573114a32f54626837d69ccf7c83541df4553360a4242ae29e"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.150227 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1e0e55f0-f333-4bc6-9905-18adf601fb9c","Type":"ContainerStarted","Data":"c03a620137c1f7554fd9f3654eea75d62af4446791b489f37f8fcd55f18c0fef"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.153020 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" event={"ID":"a9f869fc-4cf8-45c9-9eec-f2756338f2d9","Type":"ContainerStarted","Data":"d35f650e93f4d0a82badcd2cc8fabdf7f9cc9941b77275dfce70d6830b014ba0"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.153149 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.154155 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" event={"ID":"c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e","Type":"ContainerDied","Data":"9b7e9815e4d6f2489b400e554239bc8221512e958650dd3f04be51402ad2c88c"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.154196 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fknzx" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.157036 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" event={"ID":"ea6885ed-b36b-4593-9cb2-43f9c1cef850","Type":"ContainerStarted","Data":"41c51d2eb48dbaa768708c1aae2188d00739a8be85e1b68c90c8732ab80bcf18"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.157106 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.157820 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d2d0096-8154-4723-aa53-80eaeb9e4d32","Type":"ContainerStarted","Data":"2cc09e64b0a3ccbdde1191ac21f5d7f6462260e6dad9c19bcb2bdac313ff7b62"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.159080 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" event={"ID":"14f44152-2c4d-4b63-9bac-0b626ed31685","Type":"ContainerDied","Data":"cb2bb423243347293d4e09a359b258597dfaa71a94c0f424d4311be56b4e1891"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.159173 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4d64w" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.164689 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6379476-59f6-4c51-8f3f-7ea563d15030","Type":"ContainerStarted","Data":"950e98f0ae559603c3582a46c318149c09b8522b5df26b9918b1ca516d4231df"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.171688 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c1131b4e-532d-478b-bbd8-b52963f60462","Type":"ContainerStarted","Data":"a3429e25289ebf12849c6f0d4e801952a09e7c24dd9ad9ce4deb2a311a125ee7"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.173653 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7bfc1c21-6422-4308-8370-2dd0b26a3c1e","Type":"ContainerStarted","Data":"4e559cfa17e2c6acefb5933887d3cffeecda5e870b472faf865da3ac738ef65a"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.175268 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ddf3db46-b4d2-469a-bc2e-dc5610bb2807","Type":"ContainerStarted","Data":"b275c0133f69e0d21344263051b6f2992eeba3ae3dabfc00751042cb1e616e9b"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.176368 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79aeb9a3-f29e-49f0-af59-ae29868cc21e","Type":"ContainerStarted","Data":"e092da75813a67de8bf2b242c3ec544d741c373cff63dcb25bf56cf34b2b81e5"} Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.179020 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" podStartSLOduration=3.00812779 podStartE2EDuration="15.179005332s" podCreationTimestamp="2025-09-30 03:11:02 +0000 UTC" firstStartedPulling="2025-09-30 03:11:03.574712527 +0000 UTC m=+990.747932511" 
lastFinishedPulling="2025-09-30 03:11:15.745590079 +0000 UTC m=+1002.918810053" observedRunningTime="2025-09-30 03:11:17.170647513 +0000 UTC m=+1004.343867487" watchObservedRunningTime="2025-09-30 03:11:17.179005332 +0000 UTC m=+1004.352225316" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.187410 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" podStartSLOduration=2.708153573 podStartE2EDuration="15.187381272s" podCreationTimestamp="2025-09-30 03:11:02 +0000 UTC" firstStartedPulling="2025-09-30 03:11:03.266193545 +0000 UTC m=+990.439413519" lastFinishedPulling="2025-09-30 03:11:15.745421244 +0000 UTC m=+1002.918641218" observedRunningTime="2025-09-30 03:11:17.184426691 +0000 UTC m=+1004.357646665" watchObservedRunningTime="2025-09-30 03:11:17.187381272 +0000 UTC m=+1004.360601246" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.222634 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4d64w"] Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.226628 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4d64w"] Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.293033 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fknzx"] Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.303860 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fknzx"] Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.515461 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f44152-2c4d-4b63-9bac-0b626ed31685" path="/var/lib/kubelet/pods/14f44152-2c4d-4b63-9bac-0b626ed31685/volumes" Sep 30 03:11:17 crc kubenswrapper[4744]: I0930 03:11:17.515880 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e" 
path="/var/lib/kubelet/pods/c3a5f1a2-ee44-43e4-9a5f-da28cdd3585e/volumes" Sep 30 03:11:22 crc kubenswrapper[4744]: I0930 03:11:22.929681 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:23 crc kubenswrapper[4744]: I0930 03:11:23.231561 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:23 crc kubenswrapper[4744]: I0930 03:11:23.284635 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kbhdw"] Sep 30 03:11:23 crc kubenswrapper[4744]: I0930 03:11:23.284826 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerName="dnsmasq-dns" containerID="cri-o://41c51d2eb48dbaa768708c1aae2188d00739a8be85e1b68c90c8732ab80bcf18" gracePeriod=10 Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.227779 4744 generic.go:334] "Generic (PLEG): container finished" podID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerID="41c51d2eb48dbaa768708c1aae2188d00739a8be85e1b68c90c8732ab80bcf18" exitCode=0 Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.227855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" event={"ID":"ea6885ed-b36b-4593-9cb2-43f9c1cef850","Type":"ContainerDied","Data":"41c51d2eb48dbaa768708c1aae2188d00739a8be85e1b68c90c8732ab80bcf18"} Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.712879 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.890389 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9w4\" (UniqueName: \"kubernetes.io/projected/ea6885ed-b36b-4593-9cb2-43f9c1cef850-kube-api-access-nh9w4\") pod \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.890469 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-dns-svc\") pod \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.890575 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-config\") pod \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\" (UID: \"ea6885ed-b36b-4593-9cb2-43f9c1cef850\") " Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.894651 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6885ed-b36b-4593-9cb2-43f9c1cef850-kube-api-access-nh9w4" (OuterVolumeSpecName: "kube-api-access-nh9w4") pod "ea6885ed-b36b-4593-9cb2-43f9c1cef850" (UID: "ea6885ed-b36b-4593-9cb2-43f9c1cef850"). InnerVolumeSpecName "kube-api-access-nh9w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:24 crc kubenswrapper[4744]: I0930 03:11:24.992051 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9w4\" (UniqueName: \"kubernetes.io/projected/ea6885ed-b36b-4593-9cb2-43f9c1cef850-kube-api-access-nh9w4\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.035977 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-config" (OuterVolumeSpecName: "config") pod "ea6885ed-b36b-4593-9cb2-43f9c1cef850" (UID: "ea6885ed-b36b-4593-9cb2-43f9c1cef850"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.041135 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea6885ed-b36b-4593-9cb2-43f9c1cef850" (UID: "ea6885ed-b36b-4593-9cb2-43f9c1cef850"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.093362 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.094221 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6885ed-b36b-4593-9cb2-43f9c1cef850-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.137743 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-227s2"] Sep 30 03:11:25 crc kubenswrapper[4744]: E0930 03:11:25.138038 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerName="dnsmasq-dns" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.138057 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerName="dnsmasq-dns" Sep 30 03:11:25 crc kubenswrapper[4744]: E0930 03:11:25.138076 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerName="init" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.138083 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerName="init" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.138447 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" containerName="dnsmasq-dns" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.138921 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.140783 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.161058 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-227s2"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.237224 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0eb33cf6-e46d-4f10-b794-6707d21fc4ab","Type":"ContainerStarted","Data":"1c0cd3ddb3c72dd957533af3f8ced0d38983a6fd8c9f2d43dbc6f6d344c57a99"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.237608 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.238669 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d2d0096-8154-4723-aa53-80eaeb9e4d32","Type":"ContainerStarted","Data":"f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.240706 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79aeb9a3-f29e-49f0-af59-ae29868cc21e","Type":"ContainerStarted","Data":"b47da698457c41ad15230a0e3bf737f8c71f90153142d8f17152c9157c30ffd4"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.245558 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c1131b4e-532d-478b-bbd8-b52963f60462","Type":"ContainerStarted","Data":"dd170d1026477232f35bad5458582e51e4519e25afae81ce36f32bcaf9336a4b"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.248047 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"1e0e55f0-f333-4bc6-9905-18adf601fb9c","Type":"ContainerStarted","Data":"edfcec18e3ef10528f2b6fda398c5449ded4107f7c3baddbe46e7cfb03248d8d"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.250658 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7bfc1c21-6422-4308-8370-2dd0b26a3c1e","Type":"ContainerStarted","Data":"c251efa87a257cb58f9bb14fb3a8f9ad1b65cddca6470370d3d466f189436f5c"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.252607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" event={"ID":"ea6885ed-b36b-4593-9cb2-43f9c1cef850","Type":"ContainerDied","Data":"1677a1a7309600c40d02aaeb9e2d87ebf34d7b11967f284ccedc1839335c2982"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.252650 4744 scope.go:117] "RemoveContainer" containerID="41c51d2eb48dbaa768708c1aae2188d00739a8be85e1b68c90c8732ab80bcf18" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.252753 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kbhdw" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.255925 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6379476-59f6-4c51-8f3f-7ea563d15030","Type":"ContainerStarted","Data":"ef7b9da0c8679b124b65e81d314aa715994931a172d5756599fac53a69589e6e"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.256119 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.263356 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t9l7c" event={"ID":"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064","Type":"ContainerStarted","Data":"0ed6f23698e700f4a2c591550edd3cc7f4da2ef3e63652b49fd9d441104e66d9"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.266351 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-7mmdt"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.267626 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.268128 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr" event={"ID":"6aa7757e-eced-4195-8b1d-88fd7a3b322d","Type":"ContainerStarted","Data":"e322ed9dd23055be478ee8fee85c2e5e74dfc35b9e23df9327a823f8b8ab30cb"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.268796 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m95jr" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.269007 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.269945 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.30058165 podStartE2EDuration="18.26992882s" podCreationTimestamp="2025-09-30 03:11:07 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.679702811 +0000 UTC m=+1003.852922805" lastFinishedPulling="2025-09-30 03:11:23.649049991 +0000 UTC m=+1010.822269975" observedRunningTime="2025-09-30 03:11:25.259090565 +0000 UTC m=+1012.432310539" watchObservedRunningTime="2025-09-30 03:11:25.26992882 +0000 UTC m=+1012.443148794" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.278652 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ddf3db46-b4d2-469a-bc2e-dc5610bb2807","Type":"ContainerStarted","Data":"8915c4fd8d828a98c39531a6faa45cd42f7b9e7c3d6f5c1659517888d2596386"} Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.285918 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-7mmdt"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.287543 4744 scope.go:117] "RemoveContainer" containerID="7c27945ce2290b78b4a4f53d659de5ef55990762e9cf0a30759f73f7e0d01711" Sep 30 03:11:25 crc 
kubenswrapper[4744]: I0930 03:11:25.296603 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc48401-bc82-4227-a5f2-22b7b5699433-config\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.296661 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfc48401-bc82-4227-a5f2-22b7b5699433-ovn-rundir\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.296716 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfc48401-bc82-4227-a5f2-22b7b5699433-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.296756 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5252\" (UniqueName: \"kubernetes.io/projected/dfc48401-bc82-4227-a5f2-22b7b5699433-kube-api-access-n5252\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.296795 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc48401-bc82-4227-a5f2-22b7b5699433-combined-ca-bundle\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " 
pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.296819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfc48401-bc82-4227-a5f2-22b7b5699433-ovs-rundir\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.352230 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.426218703 podStartE2EDuration="16.352213214s" podCreationTimestamp="2025-09-30 03:11:09 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.677031238 +0000 UTC m=+1003.850251212" lastFinishedPulling="2025-09-30 03:11:24.603025749 +0000 UTC m=+1011.776245723" observedRunningTime="2025-09-30 03:11:25.350986306 +0000 UTC m=+1012.524206280" watchObservedRunningTime="2025-09-30 03:11:25.352213214 +0000 UTC m=+1012.525433188" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.399682 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc48401-bc82-4227-a5f2-22b7b5699433-combined-ca-bundle\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.399787 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfc48401-bc82-4227-a5f2-22b7b5699433-ovs-rundir\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.399807 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/dfc48401-bc82-4227-a5f2-22b7b5699433-config\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.399851 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfc48401-bc82-4227-a5f2-22b7b5699433-ovn-rundir\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.399942 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.399979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.400018 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68hz\" (UniqueName: \"kubernetes.io/projected/95c3ff05-36a1-4a83-a73b-0a001783c9ed-kube-api-access-f68hz\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.400069 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dfc48401-bc82-4227-a5f2-22b7b5699433-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.400095 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-config\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.400209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5252\" (UniqueName: \"kubernetes.io/projected/dfc48401-bc82-4227-a5f2-22b7b5699433-kube-api-access-n5252\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.401639 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfc48401-bc82-4227-a5f2-22b7b5699433-ovn-rundir\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.403098 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfc48401-bc82-4227-a5f2-22b7b5699433-ovs-rundir\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.403780 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dfc48401-bc82-4227-a5f2-22b7b5699433-config\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.405112 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfc48401-bc82-4227-a5f2-22b7b5699433-combined-ca-bundle\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.408686 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kbhdw"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.413914 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfc48401-bc82-4227-a5f2-22b7b5699433-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.415416 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kbhdw"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.435279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5252\" (UniqueName: \"kubernetes.io/projected/dfc48401-bc82-4227-a5f2-22b7b5699433-kube-api-access-n5252\") pod \"ovn-controller-metrics-227s2\" (UID: \"dfc48401-bc82-4227-a5f2-22b7b5699433\") " pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.459920 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-227s2" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.461576 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m95jr" podStartSLOduration=5.350111286 podStartE2EDuration="12.461564796s" podCreationTimestamp="2025-09-30 03:11:13 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.825292578 +0000 UTC m=+1003.998512552" lastFinishedPulling="2025-09-30 03:11:23.936746078 +0000 UTC m=+1011.109966062" observedRunningTime="2025-09-30 03:11:25.432370541 +0000 UTC m=+1012.605590515" watchObservedRunningTime="2025-09-30 03:11:25.461564796 +0000 UTC m=+1012.634784760" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.502666 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.502718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68hz\" (UniqueName: \"kubernetes.io/projected/95c3ff05-36a1-4a83-a73b-0a001783c9ed-kube-api-access-f68hz\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.502774 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-config\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.502873 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.503810 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-config\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.504335 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.504670 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.515216 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6885ed-b36b-4593-9cb2-43f9c1cef850" path="/var/lib/kubelet/pods/ea6885ed-b36b-4593-9cb2-43f9c1cef850/volumes" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.528956 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68hz\" (UniqueName: \"kubernetes.io/projected/95c3ff05-36a1-4a83-a73b-0a001783c9ed-kube-api-access-f68hz\") pod \"dnsmasq-dns-7fd796d7df-7mmdt\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.582586 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-7mmdt"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.583422 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.610009 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2sk2r"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.611298 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.614270 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.646456 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2sk2r"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.716388 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.716464 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kws52\" (UniqueName: \"kubernetes.io/projected/75824f59-9f8b-46d5-ad6d-668acf23a1b2-kube-api-access-kws52\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.716493 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-config\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.716525 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.716546 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.817663 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kws52\" (UniqueName: \"kubernetes.io/projected/75824f59-9f8b-46d5-ad6d-668acf23a1b2-kube-api-access-kws52\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.817714 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-config\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.817749 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.817766 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.817829 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.818780 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.818797 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-config\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.818877 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.818976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.839151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kws52\" (UniqueName: \"kubernetes.io/projected/75824f59-9f8b-46d5-ad6d-668acf23a1b2-kube-api-access-kws52\") pod \"dnsmasq-dns-86db49b7ff-2sk2r\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.974572 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-227s2"] Sep 30 03:11:25 crc kubenswrapper[4744]: I0930 03:11:25.977704 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:26 crc kubenswrapper[4744]: I0930 03:11:26.049817 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-7mmdt"] Sep 30 03:11:26 crc kubenswrapper[4744]: W0930 03:11:26.060574 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c3ff05_36a1_4a83_a73b_0a001783c9ed.slice/crio-0683f041c61cbee72560f213d9a7f5ead4ee6654e4a442c0f0621d3334a39db2 WatchSource:0}: Error finding container 0683f041c61cbee72560f213d9a7f5ead4ee6654e4a442c0f0621d3334a39db2: Status 404 returned error can't find the container with id 0683f041c61cbee72560f213d9a7f5ead4ee6654e4a442c0f0621d3334a39db2 Sep 30 03:11:26 crc kubenswrapper[4744]: I0930 03:11:26.296190 4744 generic.go:334] "Generic (PLEG): container finished" podID="f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064" containerID="0ed6f23698e700f4a2c591550edd3cc7f4da2ef3e63652b49fd9d441104e66d9" exitCode=0 Sep 30 03:11:26 crc kubenswrapper[4744]: I0930 03:11:26.296272 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t9l7c" event={"ID":"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064","Type":"ContainerDied","Data":"0ed6f23698e700f4a2c591550edd3cc7f4da2ef3e63652b49fd9d441104e66d9"} Sep 30 03:11:26 crc kubenswrapper[4744]: I0930 03:11:26.300747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" event={"ID":"95c3ff05-36a1-4a83-a73b-0a001783c9ed","Type":"ContainerStarted","Data":"0683f041c61cbee72560f213d9a7f5ead4ee6654e4a442c0f0621d3334a39db2"} Sep 30 03:11:26 crc kubenswrapper[4744]: I0930 03:11:26.302928 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-227s2" event={"ID":"dfc48401-bc82-4227-a5f2-22b7b5699433","Type":"ContainerStarted","Data":"5d73575fba874ff56c9c4fbcb3353f51c40edc543081b2ddea5b7eb5ae8cdfbe"} Sep 30 03:11:26 crc 
kubenswrapper[4744]: I0930 03:11:26.402811 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2sk2r"] Sep 30 03:11:26 crc kubenswrapper[4744]: W0930 03:11:26.435366 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75824f59_9f8b_46d5_ad6d_668acf23a1b2.slice/crio-c03f7344da0e18e5a3dd670c3017c00f91c76ba2e143a8c720cf7373def781ea WatchSource:0}: Error finding container c03f7344da0e18e5a3dd670c3017c00f91c76ba2e143a8c720cf7373def781ea: Status 404 returned error can't find the container with id c03f7344da0e18e5a3dd670c3017c00f91c76ba2e143a8c720cf7373def781ea Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.314199 4744 generic.go:334] "Generic (PLEG): container finished" podID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerID="d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5" exitCode=0 Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.314738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" event={"ID":"75824f59-9f8b-46d5-ad6d-668acf23a1b2","Type":"ContainerDied","Data":"d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5"} Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.314763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" event={"ID":"75824f59-9f8b-46d5-ad6d-668acf23a1b2","Type":"ContainerStarted","Data":"c03f7344da0e18e5a3dd670c3017c00f91c76ba2e143a8c720cf7373def781ea"} Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.320931 4744 generic.go:334] "Generic (PLEG): container finished" podID="95c3ff05-36a1-4a83-a73b-0a001783c9ed" containerID="4a53151fe776d27879d8c40f38c83752bfc6497d78d661bdfde1e7acf35682b5" exitCode=0 Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.321010 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" 
event={"ID":"95c3ff05-36a1-4a83-a73b-0a001783c9ed","Type":"ContainerDied","Data":"4a53151fe776d27879d8c40f38c83752bfc6497d78d661bdfde1e7acf35682b5"} Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.336957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t9l7c" event={"ID":"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064","Type":"ContainerStarted","Data":"4895564511d05fe7ef057066b214c70cefd1e782868bc3eb692bd42a82518265"} Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.337002 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-t9l7c" event={"ID":"f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064","Type":"ContainerStarted","Data":"eb665e10310181950f6b88ca597d24011835803448191ded0971daa25a5777b2"} Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.337240 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.337275 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:27 crc kubenswrapper[4744]: I0930 03:11:27.386867 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-t9l7c" podStartSLOduration=7.527660587 podStartE2EDuration="14.38684753s" podCreationTimestamp="2025-09-30 03:11:13 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.90851143 +0000 UTC m=+1004.081731404" lastFinishedPulling="2025-09-30 03:11:23.767698373 +0000 UTC m=+1010.940918347" observedRunningTime="2025-09-30 03:11:27.376006604 +0000 UTC m=+1014.549226568" watchObservedRunningTime="2025-09-30 03:11:27.38684753 +0000 UTC m=+1014.560067514" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.284994 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.343832 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" event={"ID":"95c3ff05-36a1-4a83-a73b-0a001783c9ed","Type":"ContainerDied","Data":"0683f041c61cbee72560f213d9a7f5ead4ee6654e4a442c0f0621d3334a39db2"} Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.343896 4744 scope.go:117] "RemoveContainer" containerID="4a53151fe776d27879d8c40f38c83752bfc6497d78d661bdfde1e7acf35682b5" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.343855 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-7mmdt" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.357314 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-ovsdbserver-nb\") pod \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.357656 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-config\") pod \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.357751 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-dns-svc\") pod \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.357798 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f68hz\" (UniqueName: 
\"kubernetes.io/projected/95c3ff05-36a1-4a83-a73b-0a001783c9ed-kube-api-access-f68hz\") pod \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\" (UID: \"95c3ff05-36a1-4a83-a73b-0a001783c9ed\") " Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.369728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c3ff05-36a1-4a83-a73b-0a001783c9ed-kube-api-access-f68hz" (OuterVolumeSpecName: "kube-api-access-f68hz") pod "95c3ff05-36a1-4a83-a73b-0a001783c9ed" (UID: "95c3ff05-36a1-4a83-a73b-0a001783c9ed"). InnerVolumeSpecName "kube-api-access-f68hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.380107 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-config" (OuterVolumeSpecName: "config") pod "95c3ff05-36a1-4a83-a73b-0a001783c9ed" (UID: "95c3ff05-36a1-4a83-a73b-0a001783c9ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.380593 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95c3ff05-36a1-4a83-a73b-0a001783c9ed" (UID: "95c3ff05-36a1-4a83-a73b-0a001783c9ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.390977 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95c3ff05-36a1-4a83-a73b-0a001783c9ed" (UID: "95c3ff05-36a1-4a83-a73b-0a001783c9ed"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.459419 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.459451 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.459461 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95c3ff05-36a1-4a83-a73b-0a001783c9ed-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.459470 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f68hz\" (UniqueName: \"kubernetes.io/projected/95c3ff05-36a1-4a83-a73b-0a001783c9ed-kube-api-access-f68hz\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.911567 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-7mmdt"] Sep 30 03:11:28 crc kubenswrapper[4744]: I0930 03:11:28.917902 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-7mmdt"] Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.356961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" event={"ID":"75824f59-9f8b-46d5-ad6d-668acf23a1b2","Type":"ContainerStarted","Data":"fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6"} Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.357729 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.361749 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-227s2" event={"ID":"dfc48401-bc82-4227-a5f2-22b7b5699433","Type":"ContainerStarted","Data":"35d7f2a5de693c59adc05aa2a181634e7d6e647dde18a7a9ed715855c7c37405"} Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.366346 4744 generic.go:334] "Generic (PLEG): container finished" podID="c1131b4e-532d-478b-bbd8-b52963f60462" containerID="dd170d1026477232f35bad5458582e51e4519e25afae81ce36f32bcaf9336a4b" exitCode=0 Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.366508 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c1131b4e-532d-478b-bbd8-b52963f60462","Type":"ContainerDied","Data":"dd170d1026477232f35bad5458582e51e4519e25afae81ce36f32bcaf9336a4b"} Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.370835 4744 generic.go:334] "Generic (PLEG): container finished" podID="ddf3db46-b4d2-469a-bc2e-dc5610bb2807" containerID="8915c4fd8d828a98c39531a6faa45cd42f7b9e7c3d6f5c1659517888d2596386" exitCode=0 Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.370912 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ddf3db46-b4d2-469a-bc2e-dc5610bb2807","Type":"ContainerDied","Data":"8915c4fd8d828a98c39531a6faa45cd42f7b9e7c3d6f5c1659517888d2596386"} Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.379571 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1e0e55f0-f333-4bc6-9905-18adf601fb9c","Type":"ContainerStarted","Data":"4606a10548e5484c96b9a55c1d0e07a71615c6c176d97dbd42bffc3eae82ef87"} Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.382636 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7bfc1c21-6422-4308-8370-2dd0b26a3c1e","Type":"ContainerStarted","Data":"26ab4dceb98a9c00bccd2eb64b0e302d1837140e45d3700bfe421ea6fba2bf2d"} Sep 30 03:11:29 crc 
kubenswrapper[4744]: I0930 03:11:29.394690 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.396343 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" podStartSLOduration=4.396323886 podStartE2EDuration="4.396323886s" podCreationTimestamp="2025-09-30 03:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:11:29.378673858 +0000 UTC m=+1016.551893832" watchObservedRunningTime="2025-09-30 03:11:29.396323886 +0000 UTC m=+1016.569543890" Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.499188 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-227s2" podStartSLOduration=1.699802775 podStartE2EDuration="4.499166027s" podCreationTimestamp="2025-09-30 03:11:25 +0000 UTC" firstStartedPulling="2025-09-30 03:11:25.981110796 +0000 UTC m=+1013.154330770" lastFinishedPulling="2025-09-30 03:11:28.780474028 +0000 UTC m=+1015.953694022" observedRunningTime="2025-09-30 03:11:29.490308312 +0000 UTC m=+1016.663528316" watchObservedRunningTime="2025-09-30 03:11:29.499166027 +0000 UTC m=+1016.672386011" Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.527894 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.566713701 podStartE2EDuration="15.527864387s" podCreationTimestamp="2025-09-30 03:11:14 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.827749134 +0000 UTC m=+1004.000969108" lastFinishedPulling="2025-09-30 03:11:28.78889982 +0000 UTC m=+1015.962119794" observedRunningTime="2025-09-30 03:11:29.526808904 +0000 UTC m=+1016.700028918" watchObservedRunningTime="2025-09-30 03:11:29.527864387 +0000 UTC m=+1016.701084401" Sep 30 03:11:29 crc 
kubenswrapper[4744]: I0930 03:11:29.536888 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c3ff05-36a1-4a83-a73b-0a001783c9ed" path="/var/lib/kubelet/pods/95c3ff05-36a1-4a83-a73b-0a001783c9ed/volumes" Sep 30 03:11:29 crc kubenswrapper[4744]: I0930 03:11:29.562564 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.794217311 podStartE2EDuration="15.562545654s" podCreationTimestamp="2025-09-30 03:11:14 +0000 UTC" firstStartedPulling="2025-09-30 03:11:17.004332363 +0000 UTC m=+1004.177552337" lastFinishedPulling="2025-09-30 03:11:28.772660696 +0000 UTC m=+1015.945880680" observedRunningTime="2025-09-30 03:11:29.558697723 +0000 UTC m=+1016.731917737" watchObservedRunningTime="2025-09-30 03:11:29.562545654 +0000 UTC m=+1016.735765628" Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.399896 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c1131b4e-532d-478b-bbd8-b52963f60462","Type":"ContainerStarted","Data":"c6410bf350ca6e6e4d9ba89713f915400c1269bda8e6516fbdc46e2a0b2c3c6d"} Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.405286 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ddf3db46-b4d2-469a-bc2e-dc5610bb2807","Type":"ContainerStarted","Data":"39c3251e5350c2df85cf56876cbab85fdf562cc3e1144ceb7a3cbaeb9a046501"} Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.436971 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.925842483 podStartE2EDuration="24.436954212s" podCreationTimestamp="2025-09-30 03:11:06 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.256546882 +0000 UTC m=+1003.429766856" lastFinishedPulling="2025-09-30 03:11:23.767658621 +0000 UTC m=+1010.940878585" observedRunningTime="2025-09-30 03:11:30.430142011 +0000 UTC m=+1017.603361985" 
watchObservedRunningTime="2025-09-30 03:11:30.436954212 +0000 UTC m=+1017.610174186" Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.460195 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=17.634870176 podStartE2EDuration="25.460169653s" podCreationTimestamp="2025-09-30 03:11:05 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.680025471 +0000 UTC m=+1003.853245455" lastFinishedPulling="2025-09-30 03:11:24.505324958 +0000 UTC m=+1011.678544932" observedRunningTime="2025-09-30 03:11:30.459524253 +0000 UTC m=+1017.632744317" watchObservedRunningTime="2025-09-30 03:11:30.460169653 +0000 UTC m=+1017.633389667" Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.904294 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.904347 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:30 crc kubenswrapper[4744]: I0930 03:11:30.957114 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.189869 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.190269 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.252140 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.473968 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.475141 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.817685 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 03:11:31 crc kubenswrapper[4744]: E0930 03:11:31.818076 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c3ff05-36a1-4a83-a73b-0a001783c9ed" containerName="init" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.818095 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c3ff05-36a1-4a83-a73b-0a001783c9ed" containerName="init" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.818305 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c3ff05-36a1-4a83-a73b-0a001783c9ed" containerName="init" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.819109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.821274 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.822034 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.823247 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-dm696" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.823432 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.835500 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.919437 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/09a4d14b-16a0-442c-8444-af404618ae96-scripts\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.919510 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.919586 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a4d14b-16a0-442c-8444-af404618ae96-config\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.919621 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8hf\" (UniqueName: \"kubernetes.io/projected/09a4d14b-16a0-442c-8444-af404618ae96-kube-api-access-ws8hf\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.919675 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09a4d14b-16a0-442c-8444-af404618ae96-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.919844 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:31 crc kubenswrapper[4744]: I0930 03:11:31.920002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021575 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021641 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a4d14b-16a0-442c-8444-af404618ae96-config\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021681 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8hf\" (UniqueName: \"kubernetes.io/projected/09a4d14b-16a0-442c-8444-af404618ae96-kube-api-access-ws8hf\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09a4d14b-16a0-442c-8444-af404618ae96-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021793 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021846 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.021869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09a4d14b-16a0-442c-8444-af404618ae96-scripts\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.023292 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a4d14b-16a0-442c-8444-af404618ae96-config\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.023350 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09a4d14b-16a0-442c-8444-af404618ae96-scripts\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.026051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09a4d14b-16a0-442c-8444-af404618ae96-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 
30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.028983 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.029846 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.040308 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a4d14b-16a0-442c-8444-af404618ae96-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.044747 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8hf\" (UniqueName: \"kubernetes.io/projected/09a4d14b-16a0-442c-8444-af404618ae96-kube-api-access-ws8hf\") pod \"ovn-northd-0\" (UID: \"09a4d14b-16a0-442c-8444-af404618ae96\") " pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.136974 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.630846 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 03:11:32 crc kubenswrapper[4744]: I0930 03:11:32.636471 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 03:11:32 crc kubenswrapper[4744]: W0930 03:11:32.641579 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a4d14b_16a0_442c_8444_af404618ae96.slice/crio-03f2ce6990cd346e2a3bd2eb9709f9bba5b6486509f902d7c4013e7016afb85c WatchSource:0}: Error finding container 03f2ce6990cd346e2a3bd2eb9709f9bba5b6486509f902d7c4013e7016afb85c: Status 404 returned error can't find the container with id 03f2ce6990cd346e2a3bd2eb9709f9bba5b6486509f902d7c4013e7016afb85c Sep 30 03:11:33 crc kubenswrapper[4744]: I0930 03:11:33.433221 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"09a4d14b-16a0-442c-8444-af404618ae96","Type":"ContainerStarted","Data":"03f2ce6990cd346e2a3bd2eb9709f9bba5b6486509f902d7c4013e7016afb85c"} Sep 30 03:11:35 crc kubenswrapper[4744]: I0930 03:11:35.979791 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:36 crc kubenswrapper[4744]: I0930 03:11:36.043839 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2wcjr"] Sep 30 03:11:36 crc kubenswrapper[4744]: I0930 03:11:36.044087 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerName="dnsmasq-dns" containerID="cri-o://d35f650e93f4d0a82badcd2cc8fabdf7f9cc9941b77275dfce70d6830b014ba0" gracePeriod=10 Sep 30 03:11:36 crc kubenswrapper[4744]: I0930 03:11:36.461403 4744 generic.go:334] 
"Generic (PLEG): container finished" podID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerID="d35f650e93f4d0a82badcd2cc8fabdf7f9cc9941b77275dfce70d6830b014ba0" exitCode=0 Sep 30 03:11:36 crc kubenswrapper[4744]: I0930 03:11:36.461431 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" event={"ID":"a9f869fc-4cf8-45c9-9eec-f2756338f2d9","Type":"ContainerDied","Data":"d35f650e93f4d0a82badcd2cc8fabdf7f9cc9941b77275dfce70d6830b014ba0"} Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.311955 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.311998 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.367754 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.421783 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.422176 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.471708 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" event={"ID":"a9f869fc-4cf8-45c9-9eec-f2756338f2d9","Type":"ContainerDied","Data":"6247b7a9f36e05b6dad1f9cb305eed1f647e43490762d6acebaf8ae23db7172f"} Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.471753 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6247b7a9f36e05b6dad1f9cb305eed1f647e43490762d6acebaf8ae23db7172f" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.527917 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.571211 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.650764 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-config\") pod \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.650846 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-dns-svc\") pod \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.650968 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdwz9\" (UniqueName: \"kubernetes.io/projected/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-kube-api-access-rdwz9\") pod \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\" (UID: \"a9f869fc-4cf8-45c9-9eec-f2756338f2d9\") " Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.654331 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-kube-api-access-rdwz9" (OuterVolumeSpecName: "kube-api-access-rdwz9") pod "a9f869fc-4cf8-45c9-9eec-f2756338f2d9" (UID: "a9f869fc-4cf8-45c9-9eec-f2756338f2d9"). InnerVolumeSpecName "kube-api-access-rdwz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.702779 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-config" (OuterVolumeSpecName: "config") pod "a9f869fc-4cf8-45c9-9eec-f2756338f2d9" (UID: "a9f869fc-4cf8-45c9-9eec-f2756338f2d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.706646 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9f869fc-4cf8-45c9-9eec-f2756338f2d9" (UID: "a9f869fc-4cf8-45c9-9eec-f2756338f2d9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.752747 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.752788 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdwz9\" (UniqueName: \"kubernetes.io/projected/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-kube-api-access-rdwz9\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:37 crc kubenswrapper[4744]: I0930 03:11:37.752802 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9f869fc-4cf8-45c9-9eec-f2756338f2d9-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:38 crc kubenswrapper[4744]: I0930 03:11:38.482967 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"09a4d14b-16a0-442c-8444-af404618ae96","Type":"ContainerStarted","Data":"3f3855c951c3648fcbf14fb30cd725d7b20c9796ee7ab7accd2256764b135a53"} Sep 30 03:11:38 
crc kubenswrapper[4744]: I0930 03:11:38.483038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"09a4d14b-16a0-442c-8444-af404618ae96","Type":"ContainerStarted","Data":"fc7d5ed137a62080dd31551f8b17457096d0dcfa459b02fe38ea97f0a53a4df3"} Sep 30 03:11:38 crc kubenswrapper[4744]: I0930 03:11:38.482999 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2wcjr" Sep 30 03:11:38 crc kubenswrapper[4744]: I0930 03:11:38.483472 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 03:11:38 crc kubenswrapper[4744]: I0930 03:11:38.521352 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.673379095 podStartE2EDuration="7.521323237s" podCreationTimestamp="2025-09-30 03:11:31 +0000 UTC" firstStartedPulling="2025-09-30 03:11:32.643883565 +0000 UTC m=+1019.817103539" lastFinishedPulling="2025-09-30 03:11:37.491827696 +0000 UTC m=+1024.665047681" observedRunningTime="2025-09-30 03:11:38.517683794 +0000 UTC m=+1025.690903808" watchObservedRunningTime="2025-09-30 03:11:38.521323237 +0000 UTC m=+1025.694543251" Sep 30 03:11:38 crc kubenswrapper[4744]: I0930 03:11:38.572190 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2wcjr"] Sep 30 03:11:38 crc kubenswrapper[4744]: I0930 03:11:38.590614 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2wcjr"] Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.427695 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-4zsf7"] Sep 30 03:11:39 crc kubenswrapper[4744]: E0930 03:11:39.428665 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerName="dnsmasq-dns" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.428773 
4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerName="dnsmasq-dns" Sep 30 03:11:39 crc kubenswrapper[4744]: E0930 03:11:39.428859 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerName="init" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.428937 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerName="init" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.429183 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" containerName="dnsmasq-dns" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.430297 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.450983 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4zsf7"] Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.482347 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8z4\" (UniqueName: \"kubernetes.io/projected/6d287118-3aa1-4314-b9fa-6bc54a28878a-kube-api-access-rk8z4\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.482692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-dns-svc\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.482882 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.483102 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-config\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.485435 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.519137 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f869fc-4cf8-45c9-9eec-f2756338f2d9" path="/var/lib/kubelet/pods/a9f869fc-4cf8-45c9-9eec-f2756338f2d9/volumes" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.589146 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8z4\" (UniqueName: \"kubernetes.io/projected/6d287118-3aa1-4314-b9fa-6bc54a28878a-kube-api-access-rk8z4\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.589450 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-dns-svc\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.589609 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.589743 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-config\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.589836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.591777 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-dns-svc\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.594151 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-nb\") pod 
\"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.607850 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-config\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.608746 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.657605 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8z4\" (UniqueName: \"kubernetes.io/projected/6d287118-3aa1-4314-b9fa-6bc54a28878a-kube-api-access-rk8z4\") pod \"dnsmasq-dns-698758b865-4zsf7\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.712996 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.755191 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:39 crc kubenswrapper[4744]: I0930 03:11:39.798280 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.205933 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4zsf7"] Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.499115 4744 generic.go:334] "Generic (PLEG): container finished" podID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerID="02a38a0031bd8a635ab3a51afe3610006095513234954d80c1e338e169d92cc0" exitCode=0 Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.499188 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4zsf7" event={"ID":"6d287118-3aa1-4314-b9fa-6bc54a28878a","Type":"ContainerDied","Data":"02a38a0031bd8a635ab3a51afe3610006095513234954d80c1e338e169d92cc0"} Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.499254 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4zsf7" event={"ID":"6d287118-3aa1-4314-b9fa-6bc54a28878a","Type":"ContainerStarted","Data":"083b559b629e906aeafdc72fd696b02b6532038570427b7cf1c3dbf4d7f4e1a9"} Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.675644 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.680672 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.683599 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.683607 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.684356 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.684435 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-kps5n" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.684490 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.807419 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.807458 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6391d7af-84fd-42ee-ac86-399fa13725de-lock\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.807533 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc 
kubenswrapper[4744]: I0930 03:11:40.807551 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6391d7af-84fd-42ee-ac86-399fa13725de-cache\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.807706 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smcrk\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-kube-api-access-smcrk\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.909979 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.910072 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6391d7af-84fd-42ee-ac86-399fa13725de-cache\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.910167 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smcrk\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-kube-api-access-smcrk\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: E0930 03:11:40.910193 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 
03:11:40 crc kubenswrapper[4744]: E0930 03:11:40.910221 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 03:11:40 crc kubenswrapper[4744]: E0930 03:11:40.910286 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift podName:6391d7af-84fd-42ee-ac86-399fa13725de nodeName:}" failed. No retries permitted until 2025-09-30 03:11:41.410266476 +0000 UTC m=+1028.583486470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift") pod "swift-storage-0" (UID: "6391d7af-84fd-42ee-ac86-399fa13725de") : configmap "swift-ring-files" not found Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.910318 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6391d7af-84fd-42ee-ac86-399fa13725de-lock\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.910413 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.910675 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6391d7af-84fd-42ee-ac86-399fa13725de-cache\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.910892 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.911144 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6391d7af-84fd-42ee-ac86-399fa13725de-lock\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.931511 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smcrk\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-kube-api-access-smcrk\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:40 crc kubenswrapper[4744]: I0930 03:11:40.934219 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.200943 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qcpwz"] Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.208417 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.212357 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.212521 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.212632 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.225711 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qcpwz"] Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321511 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-combined-ca-bundle\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321606 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-scripts\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321653 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-swiftconf\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321704 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-dispersionconf\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321755 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgj5n\" (UniqueName: \"kubernetes.io/projected/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-kube-api-access-dgj5n\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321796 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-ring-data-devices\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.321827 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-etc-swift\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423046 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-scripts\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423183 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-swiftconf\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423303 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-dispersionconf\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423451 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgj5n\" (UniqueName: \"kubernetes.io/projected/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-kube-api-access-dgj5n\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423538 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-ring-data-devices\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423590 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-etc-swift\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.423733 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-combined-ca-bundle\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.424306 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-scripts\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.425865 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-ring-data-devices\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: E0930 03:11:41.425971 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 03:11:41 crc kubenswrapper[4744]: E0930 03:11:41.425985 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 03:11:41 crc kubenswrapper[4744]: E0930 03:11:41.426023 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift podName:6391d7af-84fd-42ee-ac86-399fa13725de nodeName:}" failed. 
No retries permitted until 2025-09-30 03:11:42.426010368 +0000 UTC m=+1029.599230342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift") pod "swift-storage-0" (UID: "6391d7af-84fd-42ee-ac86-399fa13725de") : configmap "swift-ring-files" not found Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.426608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-etc-swift\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.431530 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-swiftconf\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.433078 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-combined-ca-bundle\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.443322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-dispersionconf\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.449432 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dgj5n\" (UniqueName: \"kubernetes.io/projected/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-kube-api-access-dgj5n\") pod \"swift-ring-rebalance-qcpwz\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.518632 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.518662 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4zsf7" event={"ID":"6d287118-3aa1-4314-b9fa-6bc54a28878a","Type":"ContainerStarted","Data":"57b925659a328f586a63e0901e2675c3ffb7a0c1ff15b9be4f3068b1e05c3ee0"} Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.530236 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:41 crc kubenswrapper[4744]: I0930 03:11:41.535912 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-4zsf7" podStartSLOduration=2.535880606 podStartE2EDuration="2.535880606s" podCreationTimestamp="2025-09-30 03:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:11:41.529936972 +0000 UTC m=+1028.703156936" watchObservedRunningTime="2025-09-30 03:11:41.535880606 +0000 UTC m=+1028.709100620" Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.050279 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qcpwz"] Sep 30 03:11:42 crc kubenswrapper[4744]: W0930 03:11:42.057746 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a9b94f_3d1b_40a3_9bcf_279d796e86d9.slice/crio-c03057c01c6beb654ceabc5f8e6cf5d73eff25a23ffbde009d81be6cdbce6e39 WatchSource:0}: Error 
finding container c03057c01c6beb654ceabc5f8e6cf5d73eff25a23ffbde009d81be6cdbce6e39: Status 404 returned error can't find the container with id c03057c01c6beb654ceabc5f8e6cf5d73eff25a23ffbde009d81be6cdbce6e39 Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.446686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:42 crc kubenswrapper[4744]: E0930 03:11:42.446872 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 03:11:42 crc kubenswrapper[4744]: E0930 03:11:42.447023 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 03:11:42 crc kubenswrapper[4744]: E0930 03:11:42.447102 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift podName:6391d7af-84fd-42ee-ac86-399fa13725de nodeName:}" failed. No retries permitted until 2025-09-30 03:11:44.447075618 +0000 UTC m=+1031.620295632 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift") pod "swift-storage-0" (UID: "6391d7af-84fd-42ee-ac86-399fa13725de") : configmap "swift-ring-files" not found Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.524297 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qcpwz" event={"ID":"35a9b94f-3d1b-40a3-9bcf-279d796e86d9","Type":"ContainerStarted","Data":"c03057c01c6beb654ceabc5f8e6cf5d73eff25a23ffbde009d81be6cdbce6e39"} Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.900986 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n78cc"] Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.902101 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n78cc" Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.911239 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n78cc"] Sep 30 03:11:42 crc kubenswrapper[4744]: I0930 03:11:42.957399 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjjw\" (UniqueName: \"kubernetes.io/projected/455eafbf-df1e-4746-ad86-5959ce329b4e-kube-api-access-cbjjw\") pod \"glance-db-create-n78cc\" (UID: \"455eafbf-df1e-4746-ad86-5959ce329b4e\") " pod="openstack/glance-db-create-n78cc" Sep 30 03:11:43 crc kubenswrapper[4744]: I0930 03:11:43.059043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjjw\" (UniqueName: \"kubernetes.io/projected/455eafbf-df1e-4746-ad86-5959ce329b4e-kube-api-access-cbjjw\") pod \"glance-db-create-n78cc\" (UID: \"455eafbf-df1e-4746-ad86-5959ce329b4e\") " pod="openstack/glance-db-create-n78cc" Sep 30 03:11:43 crc kubenswrapper[4744]: I0930 03:11:43.082746 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cbjjw\" (UniqueName: \"kubernetes.io/projected/455eafbf-df1e-4746-ad86-5959ce329b4e-kube-api-access-cbjjw\") pod \"glance-db-create-n78cc\" (UID: \"455eafbf-df1e-4746-ad86-5959ce329b4e\") " pod="openstack/glance-db-create-n78cc" Sep 30 03:11:43 crc kubenswrapper[4744]: I0930 03:11:43.250558 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n78cc" Sep 30 03:11:44 crc kubenswrapper[4744]: I0930 03:11:44.484114 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:44 crc kubenswrapper[4744]: E0930 03:11:44.484438 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 03:11:44 crc kubenswrapper[4744]: E0930 03:11:44.484463 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 03:11:44 crc kubenswrapper[4744]: E0930 03:11:44.484526 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift podName:6391d7af-84fd-42ee-ac86-399fa13725de nodeName:}" failed. No retries permitted until 2025-09-30 03:11:48.484501891 +0000 UTC m=+1035.657721895 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift") pod "swift-storage-0" (UID: "6391d7af-84fd-42ee-ac86-399fa13725de") : configmap "swift-ring-files" not found Sep 30 03:11:45 crc kubenswrapper[4744]: I0930 03:11:45.477005 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n78cc"] Sep 30 03:11:45 crc kubenswrapper[4744]: W0930 03:11:45.486798 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455eafbf_df1e_4746_ad86_5959ce329b4e.slice/crio-be793369859356bf217bcfe7a107bbbd21ceb7166015291fdff1b5bf8596f9e9 WatchSource:0}: Error finding container be793369859356bf217bcfe7a107bbbd21ceb7166015291fdff1b5bf8596f9e9: Status 404 returned error can't find the container with id be793369859356bf217bcfe7a107bbbd21ceb7166015291fdff1b5bf8596f9e9 Sep 30 03:11:45 crc kubenswrapper[4744]: I0930 03:11:45.557219 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qcpwz" event={"ID":"35a9b94f-3d1b-40a3-9bcf-279d796e86d9","Type":"ContainerStarted","Data":"d786b8203ed48954b3dfe359d2019131f5c49db5b816b0dad20e1ab82c6e8753"} Sep 30 03:11:45 crc kubenswrapper[4744]: I0930 03:11:45.562669 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n78cc" event={"ID":"455eafbf-df1e-4746-ad86-5959ce329b4e","Type":"ContainerStarted","Data":"be793369859356bf217bcfe7a107bbbd21ceb7166015291fdff1b5bf8596f9e9"} Sep 30 03:11:45 crc kubenswrapper[4744]: I0930 03:11:45.591919 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qcpwz" podStartSLOduration=1.5981225129999999 podStartE2EDuration="4.591901318s" podCreationTimestamp="2025-09-30 03:11:41 +0000 UTC" firstStartedPulling="2025-09-30 03:11:42.061190165 +0000 UTC m=+1029.234410139" lastFinishedPulling="2025-09-30 
03:11:45.05496893 +0000 UTC m=+1032.228188944" observedRunningTime="2025-09-30 03:11:45.589871745 +0000 UTC m=+1032.763091749" watchObservedRunningTime="2025-09-30 03:11:45.591901318 +0000 UTC m=+1032.765121302" Sep 30 03:11:46 crc kubenswrapper[4744]: I0930 03:11:46.575794 4744 generic.go:334] "Generic (PLEG): container finished" podID="455eafbf-df1e-4746-ad86-5959ce329b4e" containerID="c7d5914b4bcc66674797ac0ebbd0c2eb12dd3723f702c9a471449e9418d173e7" exitCode=0 Sep 30 03:11:46 crc kubenswrapper[4744]: I0930 03:11:46.575956 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n78cc" event={"ID":"455eafbf-df1e-4746-ad86-5959ce329b4e","Type":"ContainerDied","Data":"c7d5914b4bcc66674797ac0ebbd0c2eb12dd3723f702c9a471449e9418d173e7"} Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.205165 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-c75vh"] Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.207768 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.215062 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c75vh"] Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.233022 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2j85\" (UniqueName: \"kubernetes.io/projected/e1a89fd1-cf25-4278-974d-e3d51a3ee539-kube-api-access-t2j85\") pod \"keystone-db-create-c75vh\" (UID: \"e1a89fd1-cf25-4278-974d-e3d51a3ee539\") " pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.252974 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.335615 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2j85\" (UniqueName: \"kubernetes.io/projected/e1a89fd1-cf25-4278-974d-e3d51a3ee539-kube-api-access-t2j85\") pod \"keystone-db-create-c75vh\" (UID: \"e1a89fd1-cf25-4278-974d-e3d51a3ee539\") " pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.382244 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2j85\" (UniqueName: \"kubernetes.io/projected/e1a89fd1-cf25-4278-974d-e3d51a3ee539-kube-api-access-t2j85\") pod \"keystone-db-create-c75vh\" (UID: \"e1a89fd1-cf25-4278-974d-e3d51a3ee539\") " pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.478746 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hnh57"] Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.480493 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hnh57" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.531570 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hnh57"] Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.546883 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.550848 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttn9b\" (UniqueName: \"kubernetes.io/projected/de338fd0-f1cc-4fcb-b690-6777c2da57ce-kube-api-access-ttn9b\") pod \"placement-db-create-hnh57\" (UID: \"de338fd0-f1cc-4fcb-b690-6777c2da57ce\") " pod="openstack/placement-db-create-hnh57" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.651934 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttn9b\" (UniqueName: \"kubernetes.io/projected/de338fd0-f1cc-4fcb-b690-6777c2da57ce-kube-api-access-ttn9b\") pod \"placement-db-create-hnh57\" (UID: \"de338fd0-f1cc-4fcb-b690-6777c2da57ce\") " pod="openstack/placement-db-create-hnh57" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.675615 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttn9b\" (UniqueName: \"kubernetes.io/projected/de338fd0-f1cc-4fcb-b690-6777c2da57ce-kube-api-access-ttn9b\") pod \"placement-db-create-hnh57\" (UID: \"de338fd0-f1cc-4fcb-b690-6777c2da57ce\") " pod="openstack/placement-db-create-hnh57" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.849769 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hnh57" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.900764 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n78cc" Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.954304 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjjw\" (UniqueName: \"kubernetes.io/projected/455eafbf-df1e-4746-ad86-5959ce329b4e-kube-api-access-cbjjw\") pod \"455eafbf-df1e-4746-ad86-5959ce329b4e\" (UID: \"455eafbf-df1e-4746-ad86-5959ce329b4e\") " Sep 30 03:11:47 crc kubenswrapper[4744]: I0930 03:11:47.957962 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455eafbf-df1e-4746-ad86-5959ce329b4e-kube-api-access-cbjjw" (OuterVolumeSpecName: "kube-api-access-cbjjw") pod "455eafbf-df1e-4746-ad86-5959ce329b4e" (UID: "455eafbf-df1e-4746-ad86-5959ce329b4e"). InnerVolumeSpecName "kube-api-access-cbjjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.056414 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjjw\" (UniqueName: \"kubernetes.io/projected/455eafbf-df1e-4746-ad86-5959ce329b4e-kube-api-access-cbjjw\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:48 crc kubenswrapper[4744]: W0930 03:11:48.064887 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a89fd1_cf25_4278_974d_e3d51a3ee539.slice/crio-51f22522cda446ac0a761ea1ac2f4fd1e4711cc526139465aaa4179722394fce WatchSource:0}: Error finding container 51f22522cda446ac0a761ea1ac2f4fd1e4711cc526139465aaa4179722394fce: Status 404 returned error can't find the container with id 51f22522cda446ac0a761ea1ac2f4fd1e4711cc526139465aaa4179722394fce Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.067405 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c75vh"] Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.309694 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-create-hnh57"] Sep 30 03:11:48 crc kubenswrapper[4744]: W0930 03:11:48.342544 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde338fd0_f1cc_4fcb_b690_6777c2da57ce.slice/crio-e3ee3213f0d6b160718ab7a7a5edcd98b7606822481396e66c04c0ae2aeec922 WatchSource:0}: Error finding container e3ee3213f0d6b160718ab7a7a5edcd98b7606822481396e66c04c0ae2aeec922: Status 404 returned error can't find the container with id e3ee3213f0d6b160718ab7a7a5edcd98b7606822481396e66c04c0ae2aeec922 Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.566854 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:48 crc kubenswrapper[4744]: E0930 03:11:48.567136 4744 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 03:11:48 crc kubenswrapper[4744]: E0930 03:11:48.567161 4744 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 03:11:48 crc kubenswrapper[4744]: E0930 03:11:48.567234 4744 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift podName:6391d7af-84fd-42ee-ac86-399fa13725de nodeName:}" failed. No retries permitted until 2025-09-30 03:11:56.56721103 +0000 UTC m=+1043.740431024 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift") pod "swift-storage-0" (UID: "6391d7af-84fd-42ee-ac86-399fa13725de") : configmap "swift-ring-files" not found Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.595544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hnh57" event={"ID":"de338fd0-f1cc-4fcb-b690-6777c2da57ce","Type":"ContainerStarted","Data":"dfc5c8fdd01c4a3b506511acb2b5f895565ef84d62502bcdfce502fe2d470c12"} Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.595646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hnh57" event={"ID":"de338fd0-f1cc-4fcb-b690-6777c2da57ce","Type":"ContainerStarted","Data":"e3ee3213f0d6b160718ab7a7a5edcd98b7606822481396e66c04c0ae2aeec922"} Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.597920 4744 generic.go:334] "Generic (PLEG): container finished" podID="e1a89fd1-cf25-4278-974d-e3d51a3ee539" containerID="fcd7868f22ee9825d41fb46adb29a09efced44d28c972d90e6e7424e3add8b51" exitCode=0 Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.598023 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c75vh" event={"ID":"e1a89fd1-cf25-4278-974d-e3d51a3ee539","Type":"ContainerDied","Data":"fcd7868f22ee9825d41fb46adb29a09efced44d28c972d90e6e7424e3add8b51"} Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.598062 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c75vh" event={"ID":"e1a89fd1-cf25-4278-974d-e3d51a3ee539","Type":"ContainerStarted","Data":"51f22522cda446ac0a761ea1ac2f4fd1e4711cc526139465aaa4179722394fce"} Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.606728 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n78cc" 
event={"ID":"455eafbf-df1e-4746-ad86-5959ce329b4e","Type":"ContainerDied","Data":"be793369859356bf217bcfe7a107bbbd21ceb7166015291fdff1b5bf8596f9e9"} Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.606786 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be793369859356bf217bcfe7a107bbbd21ceb7166015291fdff1b5bf8596f9e9" Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.606893 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n78cc" Sep 30 03:11:48 crc kubenswrapper[4744]: I0930 03:11:48.643790 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-hnh57" podStartSLOduration=1.643764875 podStartE2EDuration="1.643764875s" podCreationTimestamp="2025-09-30 03:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:11:48.625568351 +0000 UTC m=+1035.798788365" watchObservedRunningTime="2025-09-30 03:11:48.643764875 +0000 UTC m=+1035.816984849" Sep 30 03:11:49 crc kubenswrapper[4744]: I0930 03:11:49.633522 4744 generic.go:334] "Generic (PLEG): container finished" podID="de338fd0-f1cc-4fcb-b690-6777c2da57ce" containerID="dfc5c8fdd01c4a3b506511acb2b5f895565ef84d62502bcdfce502fe2d470c12" exitCode=0 Sep 30 03:11:49 crc kubenswrapper[4744]: I0930 03:11:49.634162 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hnh57" event={"ID":"de338fd0-f1cc-4fcb-b690-6777c2da57ce","Type":"ContainerDied","Data":"dfc5c8fdd01c4a3b506511acb2b5f895565ef84d62502bcdfce502fe2d470c12"} Sep 30 03:11:49 crc kubenswrapper[4744]: I0930 03:11:49.770746 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:11:49 crc kubenswrapper[4744]: I0930 03:11:49.851462 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-2sk2r"] Sep 30 03:11:49 crc kubenswrapper[4744]: I0930 03:11:49.852306 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerName="dnsmasq-dns" containerID="cri-o://fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6" gracePeriod=10 Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.094794 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.198058 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2j85\" (UniqueName: \"kubernetes.io/projected/e1a89fd1-cf25-4278-974d-e3d51a3ee539-kube-api-access-t2j85\") pod \"e1a89fd1-cf25-4278-974d-e3d51a3ee539\" (UID: \"e1a89fd1-cf25-4278-974d-e3d51a3ee539\") " Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.246239 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a89fd1-cf25-4278-974d-e3d51a3ee539-kube-api-access-t2j85" (OuterVolumeSpecName: "kube-api-access-t2j85") pod "e1a89fd1-cf25-4278-974d-e3d51a3ee539" (UID: "e1a89fd1-cf25-4278-974d-e3d51a3ee539"). InnerVolumeSpecName "kube-api-access-t2j85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.300923 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2j85\" (UniqueName: \"kubernetes.io/projected/e1a89fd1-cf25-4278-974d-e3d51a3ee539-kube-api-access-t2j85\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.362142 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.401602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-config\") pod \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.401676 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kws52\" (UniqueName: \"kubernetes.io/projected/75824f59-9f8b-46d5-ad6d-668acf23a1b2-kube-api-access-kws52\") pod \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.401721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-sb\") pod \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.401812 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-dns-svc\") pod \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.401901 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-nb\") pod \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\" (UID: \"75824f59-9f8b-46d5-ad6d-668acf23a1b2\") " Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.409563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/75824f59-9f8b-46d5-ad6d-668acf23a1b2-kube-api-access-kws52" (OuterVolumeSpecName: "kube-api-access-kws52") pod "75824f59-9f8b-46d5-ad6d-668acf23a1b2" (UID: "75824f59-9f8b-46d5-ad6d-668acf23a1b2"). InnerVolumeSpecName "kube-api-access-kws52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.437536 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-config" (OuterVolumeSpecName: "config") pod "75824f59-9f8b-46d5-ad6d-668acf23a1b2" (UID: "75824f59-9f8b-46d5-ad6d-668acf23a1b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.440690 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75824f59-9f8b-46d5-ad6d-668acf23a1b2" (UID: "75824f59-9f8b-46d5-ad6d-668acf23a1b2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.442230 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75824f59-9f8b-46d5-ad6d-668acf23a1b2" (UID: "75824f59-9f8b-46d5-ad6d-668acf23a1b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.444095 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75824f59-9f8b-46d5-ad6d-668acf23a1b2" (UID: "75824f59-9f8b-46d5-ad6d-668acf23a1b2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.503749 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.503779 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.503790 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kws52\" (UniqueName: \"kubernetes.io/projected/75824f59-9f8b-46d5-ad6d-668acf23a1b2-kube-api-access-kws52\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.503800 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.503809 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75824f59-9f8b-46d5-ad6d-668acf23a1b2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.647447 4744 generic.go:334] "Generic (PLEG): container finished" podID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerID="fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6" exitCode=0 Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.647553 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" event={"ID":"75824f59-9f8b-46d5-ad6d-668acf23a1b2","Type":"ContainerDied","Data":"fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6"} Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 
03:11:50.647547 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.648069 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2sk2r" event={"ID":"75824f59-9f8b-46d5-ad6d-668acf23a1b2","Type":"ContainerDied","Data":"c03f7344da0e18e5a3dd670c3017c00f91c76ba2e143a8c720cf7373def781ea"} Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.648111 4744 scope.go:117] "RemoveContainer" containerID="fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.653207 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c75vh" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.653280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c75vh" event={"ID":"e1a89fd1-cf25-4278-974d-e3d51a3ee539","Type":"ContainerDied","Data":"51f22522cda446ac0a761ea1ac2f4fd1e4711cc526139465aaa4179722394fce"} Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.653328 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51f22522cda446ac0a761ea1ac2f4fd1e4711cc526139465aaa4179722394fce" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.676531 4744 scope.go:117] "RemoveContainer" containerID="d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.719877 4744 scope.go:117] "RemoveContainer" containerID="fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6" Sep 30 03:11:50 crc kubenswrapper[4744]: E0930 03:11:50.720467 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6\": container with ID starting with 
fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6 not found: ID does not exist" containerID="fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.720531 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6"} err="failed to get container status \"fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6\": rpc error: code = NotFound desc = could not find container \"fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6\": container with ID starting with fd9072b50467fb563aed834eb66e0ad58ca17d753654bd9c2031d83c13d6e7b6 not found: ID does not exist" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.720580 4744 scope.go:117] "RemoveContainer" containerID="d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.720724 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2sk2r"] Sep 30 03:11:50 crc kubenswrapper[4744]: E0930 03:11:50.721006 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5\": container with ID starting with d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5 not found: ID does not exist" containerID="d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.721052 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5"} err="failed to get container status \"d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5\": rpc error: code = NotFound desc = could not find container 
\"d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5\": container with ID starting with d92539ecb499cbd688bcc855465a44ee01e89b0dbfd6eeedeba6d2f41abebba5 not found: ID does not exist" Sep 30 03:11:50 crc kubenswrapper[4744]: I0930 03:11:50.726603 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2sk2r"] Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.144997 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hnh57" Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.217091 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttn9b\" (UniqueName: \"kubernetes.io/projected/de338fd0-f1cc-4fcb-b690-6777c2da57ce-kube-api-access-ttn9b\") pod \"de338fd0-f1cc-4fcb-b690-6777c2da57ce\" (UID: \"de338fd0-f1cc-4fcb-b690-6777c2da57ce\") " Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.226551 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de338fd0-f1cc-4fcb-b690-6777c2da57ce-kube-api-access-ttn9b" (OuterVolumeSpecName: "kube-api-access-ttn9b") pod "de338fd0-f1cc-4fcb-b690-6777c2da57ce" (UID: "de338fd0-f1cc-4fcb-b690-6777c2da57ce"). InnerVolumeSpecName "kube-api-access-ttn9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.319344 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttn9b\" (UniqueName: \"kubernetes.io/projected/de338fd0-f1cc-4fcb-b690-6777c2da57ce-kube-api-access-ttn9b\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.517317 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" path="/var/lib/kubelet/pods/75824f59-9f8b-46d5-ad6d-668acf23a1b2/volumes" Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.665008 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hnh57" event={"ID":"de338fd0-f1cc-4fcb-b690-6777c2da57ce","Type":"ContainerDied","Data":"e3ee3213f0d6b160718ab7a7a5edcd98b7606822481396e66c04c0ae2aeec922"} Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.666030 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3ee3213f0d6b160718ab7a7a5edcd98b7606822481396e66c04c0ae2aeec922" Sep 30 03:11:51 crc kubenswrapper[4744]: I0930 03:11:51.666259 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hnh57" Sep 30 03:11:52 crc kubenswrapper[4744]: I0930 03:11:52.682341 4744 generic.go:334] "Generic (PLEG): container finished" podID="35a9b94f-3d1b-40a3-9bcf-279d796e86d9" containerID="d786b8203ed48954b3dfe359d2019131f5c49db5b816b0dad20e1ab82c6e8753" exitCode=0 Sep 30 03:11:52 crc kubenswrapper[4744]: I0930 03:11:52.682427 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qcpwz" event={"ID":"35a9b94f-3d1b-40a3-9bcf-279d796e86d9","Type":"ContainerDied","Data":"d786b8203ed48954b3dfe359d2019131f5c49db5b816b0dad20e1ab82c6e8753"} Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.101588 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.177918 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgj5n\" (UniqueName: \"kubernetes.io/projected/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-kube-api-access-dgj5n\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.177986 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-etc-swift\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.178034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-swiftconf\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.179160 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-scripts\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.179424 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-combined-ca-bundle\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.179479 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-dispersionconf\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.179515 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-ring-data-devices\") pod \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\" (UID: \"35a9b94f-3d1b-40a3-9bcf-279d796e86d9\") " Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.180042 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.180761 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.185578 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-kube-api-access-dgj5n" (OuterVolumeSpecName: "kube-api-access-dgj5n") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "kube-api-access-dgj5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.188492 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.205086 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-scripts" (OuterVolumeSpecName: "scripts") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.209899 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.211889 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35a9b94f-3d1b-40a3-9bcf-279d796e86d9" (UID: "35a9b94f-3d1b-40a3-9bcf-279d796e86d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281813 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281853 4744 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281865 4744 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281877 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgj5n\" (UniqueName: \"kubernetes.io/projected/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-kube-api-access-dgj5n\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281895 4744 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281907 4744 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.281919 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35a9b94f-3d1b-40a3-9bcf-279d796e86d9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.698533 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qcpwz" event={"ID":"35a9b94f-3d1b-40a3-9bcf-279d796e86d9","Type":"ContainerDied","Data":"c03057c01c6beb654ceabc5f8e6cf5d73eff25a23ffbde009d81be6cdbce6e39"} Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.698827 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03057c01c6beb654ceabc5f8e6cf5d73eff25a23ffbde009d81be6cdbce6e39" Sep 30 03:11:54 crc kubenswrapper[4744]: I0930 03:11:54.698576 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qcpwz" Sep 30 03:11:55 crc kubenswrapper[4744]: I0930 03:11:55.712796 4744 generic.go:334] "Generic (PLEG): container finished" podID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerID="f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430" exitCode=0 Sep 30 03:11:55 crc kubenswrapper[4744]: I0930 03:11:55.712917 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d2d0096-8154-4723-aa53-80eaeb9e4d32","Type":"ContainerDied","Data":"f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430"} Sep 30 03:11:55 crc kubenswrapper[4744]: I0930 03:11:55.716746 4744 generic.go:334] "Generic (PLEG): container finished" podID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerID="b47da698457c41ad15230a0e3bf737f8c71f90153142d8f17152c9157c30ffd4" exitCode=0 Sep 30 03:11:55 crc kubenswrapper[4744]: I0930 03:11:55.716860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79aeb9a3-f29e-49f0-af59-ae29868cc21e","Type":"ContainerDied","Data":"b47da698457c41ad15230a0e3bf737f8c71f90153142d8f17152c9157c30ffd4"} Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.627401 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") 
pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.647456 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6391d7af-84fd-42ee-ac86-399fa13725de-etc-swift\") pod \"swift-storage-0\" (UID: \"6391d7af-84fd-42ee-ac86-399fa13725de\") " pod="openstack/swift-storage-0" Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.725961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d2d0096-8154-4723-aa53-80eaeb9e4d32","Type":"ContainerStarted","Data":"e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad"} Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.726499 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.727902 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79aeb9a3-f29e-49f0-af59-ae29868cc21e","Type":"ContainerStarted","Data":"247926f8f3cadac7f088a36ccd7fa8f1c667eaac56ae4420f2a8ec7c7c5b1c5b"} Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.728261 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.761503 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=47.730371878 podStartE2EDuration="54.761488385s" podCreationTimestamp="2025-09-30 03:11:02 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.237561753 +0000 UTC m=+1003.410781727" lastFinishedPulling="2025-09-30 03:11:23.26867825 +0000 UTC m=+1010.441898234" observedRunningTime="2025-09-30 03:11:56.754930161 +0000 UTC m=+1043.928150135" watchObservedRunningTime="2025-09-30 03:11:56.761488385 
+0000 UTC m=+1043.934708359" Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.783856 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.780329078 podStartE2EDuration="53.783838388s" podCreationTimestamp="2025-09-30 03:11:03 +0000 UTC" firstStartedPulling="2025-09-30 03:11:16.264438947 +0000 UTC m=+1003.437658921" lastFinishedPulling="2025-09-30 03:11:23.267948267 +0000 UTC m=+1010.441168231" observedRunningTime="2025-09-30 03:11:56.771407582 +0000 UTC m=+1043.944627556" watchObservedRunningTime="2025-09-30 03:11:56.783838388 +0000 UTC m=+1043.957058362" Sep 30 03:11:56 crc kubenswrapper[4744]: I0930 03:11:56.907711 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.319528 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3f9a-account-create-jztrn"] Sep 30 03:11:57 crc kubenswrapper[4744]: E0930 03:11:57.320103 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a9b94f-3d1b-40a3-9bcf-279d796e86d9" containerName="swift-ring-rebalance" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320117 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a9b94f-3d1b-40a3-9bcf-279d796e86d9" containerName="swift-ring-rebalance" Sep 30 03:11:57 crc kubenswrapper[4744]: E0930 03:11:57.320131 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a89fd1-cf25-4278-974d-e3d51a3ee539" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320140 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a89fd1-cf25-4278-974d-e3d51a3ee539" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: E0930 03:11:57.320154 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de338fd0-f1cc-4fcb-b690-6777c2da57ce" 
containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320162 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="de338fd0-f1cc-4fcb-b690-6777c2da57ce" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: E0930 03:11:57.320175 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerName="init" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320182 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerName="init" Sep 30 03:11:57 crc kubenswrapper[4744]: E0930 03:11:57.320195 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerName="dnsmasq-dns" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320201 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerName="dnsmasq-dns" Sep 30 03:11:57 crc kubenswrapper[4744]: E0930 03:11:57.320219 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455eafbf-df1e-4746-ad86-5959ce329b4e" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320225 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="455eafbf-df1e-4746-ad86-5959ce329b4e" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320417 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="de338fd0-f1cc-4fcb-b690-6777c2da57ce" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320440 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="75824f59-9f8b-46d5-ad6d-668acf23a1b2" containerName="dnsmasq-dns" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320451 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="455eafbf-df1e-4746-ad86-5959ce329b4e" 
containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320460 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a89fd1-cf25-4278-974d-e3d51a3ee539" containerName="mariadb-database-create" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.320472 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a9b94f-3d1b-40a3-9bcf-279d796e86d9" containerName="swift-ring-rebalance" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.323427 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.325615 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.330608 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3f9a-account-create-jztrn"] Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.457206 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqhzb\" (UniqueName: \"kubernetes.io/projected/dee108d7-7bde-43d7-8ae6-998fca50beda-kube-api-access-sqhzb\") pod \"keystone-3f9a-account-create-jztrn\" (UID: \"dee108d7-7bde-43d7-8ae6-998fca50beda\") " pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.535426 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1cd7-account-create-sghlw"] Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.536346 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.540286 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.547689 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1cd7-account-create-sghlw"] Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.558733 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqhzb\" (UniqueName: \"kubernetes.io/projected/dee108d7-7bde-43d7-8ae6-998fca50beda-kube-api-access-sqhzb\") pod \"keystone-3f9a-account-create-jztrn\" (UID: \"dee108d7-7bde-43d7-8ae6-998fca50beda\") " pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.583060 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqhzb\" (UniqueName: \"kubernetes.io/projected/dee108d7-7bde-43d7-8ae6-998fca50beda-kube-api-access-sqhzb\") pod \"keystone-3f9a-account-create-jztrn\" (UID: \"dee108d7-7bde-43d7-8ae6-998fca50beda\") " pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.655916 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.660878 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mphg\" (UniqueName: \"kubernetes.io/projected/e2ed2a0b-0154-4130-b676-15608db7b540-kube-api-access-7mphg\") pod \"placement-1cd7-account-create-sghlw\" (UID: \"e2ed2a0b-0154-4130-b676-15608db7b540\") " pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.686500 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 03:11:57 crc kubenswrapper[4744]: W0930 03:11:57.696501 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6391d7af_84fd_42ee_ac86_399fa13725de.slice/crio-31be11a715d4f64b1cb199f003ae36e5ec9828405e3fda287c0b83b29362d668 WatchSource:0}: Error finding container 31be11a715d4f64b1cb199f003ae36e5ec9828405e3fda287c0b83b29362d668: Status 404 returned error can't find the container with id 31be11a715d4f64b1cb199f003ae36e5ec9828405e3fda287c0b83b29362d668 Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.738554 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"31be11a715d4f64b1cb199f003ae36e5ec9828405e3fda287c0b83b29362d668"} Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.762083 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mphg\" (UniqueName: \"kubernetes.io/projected/e2ed2a0b-0154-4130-b676-15608db7b540-kube-api-access-7mphg\") pod \"placement-1cd7-account-create-sghlw\" (UID: \"e2ed2a0b-0154-4130-b676-15608db7b540\") " pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.780333 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mphg\" (UniqueName: \"kubernetes.io/projected/e2ed2a0b-0154-4130-b676-15608db7b540-kube-api-access-7mphg\") pod \"placement-1cd7-account-create-sghlw\" (UID: \"e2ed2a0b-0154-4130-b676-15608db7b540\") " pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:11:57 crc kubenswrapper[4744]: I0930 03:11:57.861106 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.081466 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3f9a-account-create-jztrn"] Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.089962 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1cd7-account-create-sghlw"] Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.643609 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m95jr" podUID="6aa7757e-eced-4195-8b1d-88fd7a3b322d" containerName="ovn-controller" probeResult="failure" output=< Sep 30 03:11:58 crc kubenswrapper[4744]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 03:11:58 crc kubenswrapper[4744]: > Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.716921 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.723163 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-t9l7c" Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.749019 4744 generic.go:334] "Generic (PLEG): container finished" podID="dee108d7-7bde-43d7-8ae6-998fca50beda" containerID="3fdbce881c392195b16267d99262d0baf566d2edaeeff38e5061e191b28d3606" exitCode=0 Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.749117 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3f9a-account-create-jztrn" event={"ID":"dee108d7-7bde-43d7-8ae6-998fca50beda","Type":"ContainerDied","Data":"3fdbce881c392195b16267d99262d0baf566d2edaeeff38e5061e191b28d3606"} Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.749171 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3f9a-account-create-jztrn" event={"ID":"dee108d7-7bde-43d7-8ae6-998fca50beda","Type":"ContainerStarted","Data":"d58573b1b6e856f5fae25dff04faede5fb723f6f64ba103f159667986b395fd3"} Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.753401 4744 generic.go:334] "Generic (PLEG): container finished" podID="e2ed2a0b-0154-4130-b676-15608db7b540" containerID="3be0d944db0f753421ffb9b4a3884f87421a49ff54d5be7d33bbe428d030c27f" exitCode=0 Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.753483 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1cd7-account-create-sghlw" event={"ID":"e2ed2a0b-0154-4130-b676-15608db7b540","Type":"ContainerDied","Data":"3be0d944db0f753421ffb9b4a3884f87421a49ff54d5be7d33bbe428d030c27f"} Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.753522 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1cd7-account-create-sghlw" event={"ID":"e2ed2a0b-0154-4130-b676-15608db7b540","Type":"ContainerStarted","Data":"15d9eee89cf9ed36c4a75a6f1d803c648a1ad4e378eaacdf387c45e0f8ab5b6b"} Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.948067 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m95jr-config-drgzk"] Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.949127 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.954593 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 03:11:58 crc kubenswrapper[4744]: I0930 03:11:58.965769 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m95jr-config-drgzk"] Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.081758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-scripts\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.081818 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.081850 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-log-ovn\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.081891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run-ovn\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") 
" pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.081917 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-additional-scripts\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.082072 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgxrf\" (UniqueName: \"kubernetes.io/projected/f0093b29-257e-429c-85a3-e8301a5db825-kube-api-access-bgxrf\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.183086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-scripts\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.183357 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.183627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-log-ovn\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: 
\"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.183657 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run-ovn\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.183676 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-additional-scripts\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.183700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgxrf\" (UniqueName: \"kubernetes.io/projected/f0093b29-257e-429c-85a3-e8301a5db825-kube-api-access-bgxrf\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.184245 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.184286 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-log-ovn\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: 
\"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.184321 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run-ovn\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.184872 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-additional-scripts\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.186024 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-scripts\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.202970 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgxrf\" (UniqueName: \"kubernetes.io/projected/f0093b29-257e-429c-85a3-e8301a5db825-kube-api-access-bgxrf\") pod \"ovn-controller-m95jr-config-drgzk\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.266869 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.803132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"c6c5b70e4c6ed2e09983d80694842cd4a7170b6edf529af9aa345ed0789bd4ba"} Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.803498 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"60925002a4528693cd778df939c75408930b5c5e2005a2ed94451aaed3656211"} Sep 30 03:11:59 crc kubenswrapper[4744]: I0930 03:11:59.821224 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m95jr-config-drgzk"] Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.077717 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.183688 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.205771 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqhzb\" (UniqueName: \"kubernetes.io/projected/dee108d7-7bde-43d7-8ae6-998fca50beda-kube-api-access-sqhzb\") pod \"dee108d7-7bde-43d7-8ae6-998fca50beda\" (UID: \"dee108d7-7bde-43d7-8ae6-998fca50beda\") " Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.212361 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee108d7-7bde-43d7-8ae6-998fca50beda-kube-api-access-sqhzb" (OuterVolumeSpecName: "kube-api-access-sqhzb") pod "dee108d7-7bde-43d7-8ae6-998fca50beda" (UID: "dee108d7-7bde-43d7-8ae6-998fca50beda"). 
InnerVolumeSpecName "kube-api-access-sqhzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.307894 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mphg\" (UniqueName: \"kubernetes.io/projected/e2ed2a0b-0154-4130-b676-15608db7b540-kube-api-access-7mphg\") pod \"e2ed2a0b-0154-4130-b676-15608db7b540\" (UID: \"e2ed2a0b-0154-4130-b676-15608db7b540\") " Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.308294 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqhzb\" (UniqueName: \"kubernetes.io/projected/dee108d7-7bde-43d7-8ae6-998fca50beda-kube-api-access-sqhzb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.312576 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ed2a0b-0154-4130-b676-15608db7b540-kube-api-access-7mphg" (OuterVolumeSpecName: "kube-api-access-7mphg") pod "e2ed2a0b-0154-4130-b676-15608db7b540" (UID: "e2ed2a0b-0154-4130-b676-15608db7b540"). InnerVolumeSpecName "kube-api-access-7mphg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.409757 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mphg\" (UniqueName: \"kubernetes.io/projected/e2ed2a0b-0154-4130-b676-15608db7b540-kube-api-access-7mphg\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.812982 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3f9a-account-create-jztrn" event={"ID":"dee108d7-7bde-43d7-8ae6-998fca50beda","Type":"ContainerDied","Data":"d58573b1b6e856f5fae25dff04faede5fb723f6f64ba103f159667986b395fd3"} Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.813245 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58573b1b6e856f5fae25dff04faede5fb723f6f64ba103f159667986b395fd3" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.813001 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3f9a-account-create-jztrn" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.814129 4744 generic.go:334] "Generic (PLEG): container finished" podID="f0093b29-257e-429c-85a3-e8301a5db825" containerID="e631872fe00b62bf45d049cdcceac111c1486239a0419fb604ef4c6fa813737f" exitCode=0 Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.814187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr-config-drgzk" event={"ID":"f0093b29-257e-429c-85a3-e8301a5db825","Type":"ContainerDied","Data":"e631872fe00b62bf45d049cdcceac111c1486239a0419fb604ef4c6fa813737f"} Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.814213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr-config-drgzk" event={"ID":"f0093b29-257e-429c-85a3-e8301a5db825","Type":"ContainerStarted","Data":"0d2ae0a2f526d0fa77fbe2c6ddde6c0fc9243465126a06099c7142f9bd34bac7"} Sep 30 03:12:00 crc 
kubenswrapper[4744]: I0930 03:12:00.815209 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1cd7-account-create-sghlw" event={"ID":"e2ed2a0b-0154-4130-b676-15608db7b540","Type":"ContainerDied","Data":"15d9eee89cf9ed36c4a75a6f1d803c648a1ad4e378eaacdf387c45e0f8ab5b6b"} Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.815243 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15d9eee89cf9ed36c4a75a6f1d803c648a1ad4e378eaacdf387c45e0f8ab5b6b" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.815217 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1cd7-account-create-sghlw" Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.817471 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"1af893527d6fb1118c5257b7c3f672c111dc7429041261629169d859b0f5f432"} Sep 30 03:12:00 crc kubenswrapper[4744]: I0930 03:12:00.817511 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"196ccc6e24a8789920ccb0b70f2b7e7b4a441d0d2f55a8370398cac09fcf9727"} Sep 30 03:12:01 crc kubenswrapper[4744]: I0930 03:12:01.829978 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"31c4830d5689505ae1652744d600d12ac3e293301e65a8e3a01eb144a7836446"} Sep 30 03:12:01 crc kubenswrapper[4744]: I0930 03:12:01.831029 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"15badc84c9fb5d7b60730f5637f79b5a4aa760f727b6ef8f9679bcd576447f4f"} Sep 30 03:12:01 crc kubenswrapper[4744]: I0930 03:12:01.831116 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"b73f6c361f8c619879388b9289a67b5fbb8eb75099a10b824a713355dfbb27ff"} Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.109847 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242276 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run-ovn\") pod \"f0093b29-257e-429c-85a3-e8301a5db825\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242319 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-log-ovn\") pod \"f0093b29-257e-429c-85a3-e8301a5db825\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242352 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgxrf\" (UniqueName: \"kubernetes.io/projected/f0093b29-257e-429c-85a3-e8301a5db825-kube-api-access-bgxrf\") pod \"f0093b29-257e-429c-85a3-e8301a5db825\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242390 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-scripts\") pod \"f0093b29-257e-429c-85a3-e8301a5db825\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242413 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f0093b29-257e-429c-85a3-e8301a5db825" (UID: "f0093b29-257e-429c-85a3-e8301a5db825"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run\") pod \"f0093b29-257e-429c-85a3-e8301a5db825\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242472 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f0093b29-257e-429c-85a3-e8301a5db825" (UID: "f0093b29-257e-429c-85a3-e8301a5db825"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242511 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-additional-scripts\") pod \"f0093b29-257e-429c-85a3-e8301a5db825\" (UID: \"f0093b29-257e-429c-85a3-e8301a5db825\") " Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run" (OuterVolumeSpecName: "var-run") pod "f0093b29-257e-429c-85a3-e8301a5db825" (UID: "f0093b29-257e-429c-85a3-e8301a5db825"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242795 4744 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242806 4744 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.242814 4744 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0093b29-257e-429c-85a3-e8301a5db825-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.243461 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f0093b29-257e-429c-85a3-e8301a5db825" (UID: "f0093b29-257e-429c-85a3-e8301a5db825"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.243876 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-scripts" (OuterVolumeSpecName: "scripts") pod "f0093b29-257e-429c-85a3-e8301a5db825" (UID: "f0093b29-257e-429c-85a3-e8301a5db825"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.247495 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0093b29-257e-429c-85a3-e8301a5db825-kube-api-access-bgxrf" (OuterVolumeSpecName: "kube-api-access-bgxrf") pod "f0093b29-257e-429c-85a3-e8301a5db825" (UID: "f0093b29-257e-429c-85a3-e8301a5db825"). InnerVolumeSpecName "kube-api-access-bgxrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.345605 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgxrf\" (UniqueName: \"kubernetes.io/projected/f0093b29-257e-429c-85a3-e8301a5db825-kube-api-access-bgxrf\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.345941 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.345964 4744 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f0093b29-257e-429c-85a3-e8301a5db825-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.836435 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8331-account-create-gqdbp"] Sep 30 03:12:02 crc kubenswrapper[4744]: E0930 03:12:02.837078 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0093b29-257e-429c-85a3-e8301a5db825" containerName="ovn-config" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.837096 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0093b29-257e-429c-85a3-e8301a5db825" containerName="ovn-config" Sep 30 03:12:02 crc kubenswrapper[4744]: E0930 03:12:02.837113 4744 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="dee108d7-7bde-43d7-8ae6-998fca50beda" containerName="mariadb-account-create" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.837122 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee108d7-7bde-43d7-8ae6-998fca50beda" containerName="mariadb-account-create" Sep 30 03:12:02 crc kubenswrapper[4744]: E0930 03:12:02.837146 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ed2a0b-0154-4130-b676-15608db7b540" containerName="mariadb-account-create" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.837155 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ed2a0b-0154-4130-b676-15608db7b540" containerName="mariadb-account-create" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.837355 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ed2a0b-0154-4130-b676-15608db7b540" containerName="mariadb-account-create" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.837424 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0093b29-257e-429c-85a3-e8301a5db825" containerName="ovn-config" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.837442 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee108d7-7bde-43d7-8ae6-998fca50beda" containerName="mariadb-account-create" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.838345 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.842521 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8331-account-create-gqdbp"] Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.884139 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.885566 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m95jr-config-drgzk" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.885599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr-config-drgzk" event={"ID":"f0093b29-257e-429c-85a3-e8301a5db825","Type":"ContainerDied","Data":"0d2ae0a2f526d0fa77fbe2c6ddde6c0fc9243465126a06099c7142f9bd34bac7"} Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.885647 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d2ae0a2f526d0fa77fbe2c6ddde6c0fc9243465126a06099c7142f9bd34bac7" Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.899108 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"3b85f178ca6a4020472fbfb37dc9519d3ff760c72829198a38506f6db46f21cb"} Sep 30 03:12:02 crc kubenswrapper[4744]: I0930 03:12:02.986276 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp4nm\" (UniqueName: \"kubernetes.io/projected/d14222f9-2d27-470f-938b-95b8176185ba-kube-api-access-wp4nm\") pod \"glance-8331-account-create-gqdbp\" (UID: \"d14222f9-2d27-470f-938b-95b8176185ba\") " pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.087830 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp4nm\" (UniqueName: \"kubernetes.io/projected/d14222f9-2d27-470f-938b-95b8176185ba-kube-api-access-wp4nm\") pod \"glance-8331-account-create-gqdbp\" (UID: \"d14222f9-2d27-470f-938b-95b8176185ba\") " pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.104724 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp4nm\" (UniqueName: 
\"kubernetes.io/projected/d14222f9-2d27-470f-938b-95b8176185ba-kube-api-access-wp4nm\") pod \"glance-8331-account-create-gqdbp\" (UID: \"d14222f9-2d27-470f-938b-95b8176185ba\") " pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.201293 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.223903 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m95jr-config-drgzk"] Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.229139 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m95jr-config-drgzk"] Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.362103 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m95jr-config-7ss2p"] Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.363080 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.365677 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.376095 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m95jr-config-7ss2p"] Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.493160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.493245 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-log-ovn\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.493296 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-additional-scripts\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.493319 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run-ovn\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: 
\"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.493346 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-scripts\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.493392 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkxtf\" (UniqueName: \"kubernetes.io/projected/3eebd2f6-1365-4d77-bc1d-aa19d881a346-kube-api-access-pkxtf\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.515816 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0093b29-257e-429c-85a3-e8301a5db825" path="/var/lib/kubelet/pods/f0093b29-257e-429c-85a3-e8301a5db825/volumes" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595076 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595169 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-log-ovn\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: 
I0930 03:12:03.595244 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-additional-scripts\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595271 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run-ovn\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595303 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-scripts\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595337 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkxtf\" (UniqueName: \"kubernetes.io/projected/3eebd2f6-1365-4d77-bc1d-aa19d881a346-kube-api-access-pkxtf\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595470 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.595470 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run-ovn\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.596146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-log-ovn\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.596553 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-additional-scripts\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.597544 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-scripts\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.629687 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkxtf\" (UniqueName: \"kubernetes.io/projected/3eebd2f6-1365-4d77-bc1d-aa19d881a346-kube-api-access-pkxtf\") pod \"ovn-controller-m95jr-config-7ss2p\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.631681 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-m95jr" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.681078 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.699613 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8331-account-create-gqdbp"] Sep 30 03:12:03 crc kubenswrapper[4744]: W0930 03:12:03.704344 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd14222f9_2d27_470f_938b_95b8176185ba.slice/crio-193e5ff3f14864c4283dc4df1e41f7b308ea8f3a9489a2b0e6532e9f9646fcd4 WatchSource:0}: Error finding container 193e5ff3f14864c4283dc4df1e41f7b308ea8f3a9489a2b0e6532e9f9646fcd4: Status 404 returned error can't find the container with id 193e5ff3f14864c4283dc4df1e41f7b308ea8f3a9489a2b0e6532e9f9646fcd4 Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.966115 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"fca322cb2eb57b93fc4e79e36021d3cb62b07b0887e7562b8dfa94ce09e05486"} Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.966491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"0aeeaceca4e60f3a7a7791fc5225a464d6b9464e81996f7ede86f2058dc067b2"} Sep 30 03:12:03 crc kubenswrapper[4744]: I0930 03:12:03.976477 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8331-account-create-gqdbp" event={"ID":"d14222f9-2d27-470f-938b-95b8176185ba","Type":"ContainerStarted","Data":"193e5ff3f14864c4283dc4df1e41f7b308ea8f3a9489a2b0e6532e9f9646fcd4"} Sep 30 03:12:04 crc kubenswrapper[4744]: I0930 03:12:04.276335 4744 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-m95jr-config-7ss2p"] Sep 30 03:12:04 crc kubenswrapper[4744]: I0930 03:12:04.997858 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"e69d877a01556bc2c4f6e983f705f9b7713723c8ce6eefd4449605dae7ef8996"} Sep 30 03:12:04 crc kubenswrapper[4744]: I0930 03:12:04.999517 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr-config-7ss2p" event={"ID":"3eebd2f6-1365-4d77-bc1d-aa19d881a346","Type":"ContainerStarted","Data":"04d30fdc032cce32576daf5566b30482d3a3dbf3ae397d923e7df171227bd856"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.020946 4744 generic.go:334] "Generic (PLEG): container finished" podID="d14222f9-2d27-470f-938b-95b8176185ba" containerID="fbae2c1ec3e1b84d16b40aa322c78a32377c67e7e61aa749e4aa9dab39f03622" exitCode=0 Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.021059 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8331-account-create-gqdbp" event={"ID":"d14222f9-2d27-470f-938b-95b8176185ba","Type":"ContainerDied","Data":"fbae2c1ec3e1b84d16b40aa322c78a32377c67e7e61aa749e4aa9dab39f03622"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.037180 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"4106619adb8f1748e2c8b4bd3c4f62f0d8f94716ab9e30760b4c9248af4a177a"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.037273 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"7c242b44afe3d7ee450caeb179d58bec32e563b3affe54102cd741932e843581"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.037298 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"c50b9f94d1e637091c00a4421935dbc18212b036acffcf1b911d6bbc6851b501"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.037318 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6391d7af-84fd-42ee-ac86-399fa13725de","Type":"ContainerStarted","Data":"f213b71a61258429f82cb3f5bd5135bb37f0f84564004c8ed3481fee00eaad2c"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.040431 4744 generic.go:334] "Generic (PLEG): container finished" podID="3eebd2f6-1365-4d77-bc1d-aa19d881a346" containerID="d9552bee01f64bff75395f2036cae62211d09736edb53991f1d5a0ce1386546d" exitCode=0 Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.040483 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr-config-7ss2p" event={"ID":"3eebd2f6-1365-4d77-bc1d-aa19d881a346","Type":"ContainerDied","Data":"d9552bee01f64bff75395f2036cae62211d09736edb53991f1d5a0ce1386546d"} Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.077128 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.773033663 podStartE2EDuration="27.077103357s" podCreationTimestamp="2025-09-30 03:11:39 +0000 UTC" firstStartedPulling="2025-09-30 03:11:57.698901868 +0000 UTC m=+1044.872121842" lastFinishedPulling="2025-09-30 03:12:03.002971562 +0000 UTC m=+1050.176191536" observedRunningTime="2025-09-30 03:12:06.073710872 +0000 UTC m=+1053.246930846" watchObservedRunningTime="2025-09-30 03:12:06.077103357 +0000 UTC m=+1053.250323361" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.407407 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-gnvkr"] Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.409310 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.411705 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.430272 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-gnvkr"] Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.551847 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clw78\" (UniqueName: \"kubernetes.io/projected/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-kube-api-access-clw78\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.551963 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.552018 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-config\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.552045 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.552092 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.552263 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.653498 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clw78\" (UniqueName: \"kubernetes.io/projected/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-kube-api-access-clw78\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.653901 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.654020 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-config\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.654128 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.654259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.654412 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.655469 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.655483 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 
03:12:06.656011 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.656485 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-config\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.658035 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.677658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clw78\" (UniqueName: \"kubernetes.io/projected/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-kube-api-access-clw78\") pod \"dnsmasq-dns-77585f5f8c-gnvkr\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:06 crc kubenswrapper[4744]: I0930 03:12:06.731790 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.084603 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-gnvkr"] Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.407896 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.430034 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.575904 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-log-ovn\") pod \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.576267 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-scripts\") pod \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.576349 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-additional-scripts\") pod \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.576399 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp4nm\" (UniqueName: \"kubernetes.io/projected/d14222f9-2d27-470f-938b-95b8176185ba-kube-api-access-wp4nm\") pod \"d14222f9-2d27-470f-938b-95b8176185ba\" (UID: \"d14222f9-2d27-470f-938b-95b8176185ba\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.576420 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run\") pod \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\" (UID: 
\"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.576456 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run-ovn\") pod \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.576535 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkxtf\" (UniqueName: \"kubernetes.io/projected/3eebd2f6-1365-4d77-bc1d-aa19d881a346-kube-api-access-pkxtf\") pod \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\" (UID: \"3eebd2f6-1365-4d77-bc1d-aa19d881a346\") " Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.578801 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run" (OuterVolumeSpecName: "var-run") pod "3eebd2f6-1365-4d77-bc1d-aa19d881a346" (UID: "3eebd2f6-1365-4d77-bc1d-aa19d881a346"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.579140 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3eebd2f6-1365-4d77-bc1d-aa19d881a346" (UID: "3eebd2f6-1365-4d77-bc1d-aa19d881a346"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.579205 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3eebd2f6-1365-4d77-bc1d-aa19d881a346" (UID: "3eebd2f6-1365-4d77-bc1d-aa19d881a346"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.579310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3eebd2f6-1365-4d77-bc1d-aa19d881a346" (UID: "3eebd2f6-1365-4d77-bc1d-aa19d881a346"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.580331 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eebd2f6-1365-4d77-bc1d-aa19d881a346-kube-api-access-pkxtf" (OuterVolumeSpecName: "kube-api-access-pkxtf") pod "3eebd2f6-1365-4d77-bc1d-aa19d881a346" (UID: "3eebd2f6-1365-4d77-bc1d-aa19d881a346"). InnerVolumeSpecName "kube-api-access-pkxtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.580449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-scripts" (OuterVolumeSpecName: "scripts") pod "3eebd2f6-1365-4d77-bc1d-aa19d881a346" (UID: "3eebd2f6-1365-4d77-bc1d-aa19d881a346"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.586269 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14222f9-2d27-470f-938b-95b8176185ba-kube-api-access-wp4nm" (OuterVolumeSpecName: "kube-api-access-wp4nm") pod "d14222f9-2d27-470f-938b-95b8176185ba" (UID: "d14222f9-2d27-470f-938b-95b8176185ba"). InnerVolumeSpecName "kube-api-access-wp4nm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678796 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkxtf\" (UniqueName: \"kubernetes.io/projected/3eebd2f6-1365-4d77-bc1d-aa19d881a346-kube-api-access-pkxtf\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678843 4744 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678861 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678876 4744 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3eebd2f6-1365-4d77-bc1d-aa19d881a346-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678891 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp4nm\" (UniqueName: \"kubernetes.io/projected/d14222f9-2d27-470f-938b-95b8176185ba-kube-api-access-wp4nm\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678908 4744 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:07 crc kubenswrapper[4744]: I0930 03:12:07.678921 4744 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3eebd2f6-1365-4d77-bc1d-aa19d881a346-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.064213 4744 
generic.go:334] "Generic (PLEG): container finished" podID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerID="6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124" exitCode=0 Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.064307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" event={"ID":"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315","Type":"ContainerDied","Data":"6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124"} Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.064346 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" event={"ID":"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315","Type":"ContainerStarted","Data":"d1267098634b6d137cb718556050380e39bed13a10e57569a79aec5fb9e3cab7"} Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.067437 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8331-account-create-gqdbp" Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.067414 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8331-account-create-gqdbp" event={"ID":"d14222f9-2d27-470f-938b-95b8176185ba","Type":"ContainerDied","Data":"193e5ff3f14864c4283dc4df1e41f7b308ea8f3a9489a2b0e6532e9f9646fcd4"} Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.067584 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="193e5ff3f14864c4283dc4df1e41f7b308ea8f3a9489a2b0e6532e9f9646fcd4" Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.069646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m95jr-config-7ss2p" event={"ID":"3eebd2f6-1365-4d77-bc1d-aa19d881a346","Type":"ContainerDied","Data":"04d30fdc032cce32576daf5566b30482d3a3dbf3ae397d923e7df171227bd856"} Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.069707 4744 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="04d30fdc032cce32576daf5566b30482d3a3dbf3ae397d923e7df171227bd856" Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.069739 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m95jr-config-7ss2p" Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.517407 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m95jr-config-7ss2p"] Sep 30 03:12:08 crc kubenswrapper[4744]: I0930 03:12:08.526249 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m95jr-config-7ss2p"] Sep 30 03:12:09 crc kubenswrapper[4744]: I0930 03:12:09.085900 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" event={"ID":"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315","Type":"ContainerStarted","Data":"4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3"} Sep 30 03:12:09 crc kubenswrapper[4744]: I0930 03:12:09.086867 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:09 crc kubenswrapper[4744]: I0930 03:12:09.122480 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" podStartSLOduration=3.122450162 podStartE2EDuration="3.122450162s" podCreationTimestamp="2025-09-30 03:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:12:09.111506222 +0000 UTC m=+1056.284726206" watchObservedRunningTime="2025-09-30 03:12:09.122450162 +0000 UTC m=+1056.295670146" Sep 30 03:12:09 crc kubenswrapper[4744]: I0930 03:12:09.522141 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eebd2f6-1365-4d77-bc1d-aa19d881a346" path="/var/lib/kubelet/pods/3eebd2f6-1365-4d77-bc1d-aa19d881a346/volumes" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.007733 4744 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rzbkf"] Sep 30 03:12:13 crc kubenswrapper[4744]: E0930 03:12:13.008748 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eebd2f6-1365-4d77-bc1d-aa19d881a346" containerName="ovn-config" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.008777 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eebd2f6-1365-4d77-bc1d-aa19d881a346" containerName="ovn-config" Sep 30 03:12:13 crc kubenswrapper[4744]: E0930 03:12:13.008815 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14222f9-2d27-470f-938b-95b8176185ba" containerName="mariadb-account-create" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.008832 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14222f9-2d27-470f-938b-95b8176185ba" containerName="mariadb-account-create" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.009116 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eebd2f6-1365-4d77-bc1d-aa19d881a346" containerName="ovn-config" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.009164 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14222f9-2d27-470f-938b-95b8176185ba" containerName="mariadb-account-create" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.009818 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.013652 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rt5lh" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.013774 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.019420 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rzbkf"] Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.110649 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-config-data\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.110794 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-combined-ca-bundle\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.110822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-db-sync-config-data\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.110867 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcdr\" (UniqueName: 
\"kubernetes.io/projected/dab1acfa-0313-4621-9d6e-6ab34807d0e5-kube-api-access-kqcdr\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.212524 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-config-data\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.212630 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-combined-ca-bundle\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.212653 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-db-sync-config-data\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.212713 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcdr\" (UniqueName: \"kubernetes.io/projected/dab1acfa-0313-4621-9d6e-6ab34807d0e5-kube-api-access-kqcdr\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.218713 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-config-data\") pod \"glance-db-sync-rzbkf\" (UID: 
\"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.219116 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-combined-ca-bundle\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.227559 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-db-sync-config-data\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.235015 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcdr\" (UniqueName: \"kubernetes.io/projected/dab1acfa-0313-4621-9d6e-6ab34807d0e5-kube-api-access-kqcdr\") pod \"glance-db-sync-rzbkf\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.340345 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:13 crc kubenswrapper[4744]: I0930 03:12:13.923304 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rzbkf"] Sep 30 03:12:14 crc kubenswrapper[4744]: I0930 03:12:14.095605 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:12:14 crc kubenswrapper[4744]: I0930 03:12:14.145266 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rzbkf" event={"ID":"dab1acfa-0313-4621-9d6e-6ab34807d0e5","Type":"ContainerStarted","Data":"122c23c906878a2495de0117a8648ecaa4014fa78803ee086b12152e1c05fc87"} Sep 30 03:12:14 crc kubenswrapper[4744]: I0930 03:12:14.403652 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.034841 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jt29c"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.036469 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.043725 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jt29c"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.131510 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-p2lq5"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.132705 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.143522 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-p2lq5"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.184580 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q4bv\" (UniqueName: \"kubernetes.io/projected/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e-kube-api-access-5q4bv\") pod \"barbican-db-create-p2lq5\" (UID: \"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e\") " pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.184658 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bsd6\" (UniqueName: \"kubernetes.io/projected/8221af2a-5c5a-4b77-82da-57f20d0e50c7-kube-api-access-7bsd6\") pod \"cinder-db-create-jt29c\" (UID: \"8221af2a-5c5a-4b77-82da-57f20d0e50c7\") " pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.224885 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-4mczb"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.225945 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4mczb" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.278999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-4mczb"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.285754 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q4bv\" (UniqueName: \"kubernetes.io/projected/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e-kube-api-access-5q4bv\") pod \"barbican-db-create-p2lq5\" (UID: \"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e\") " pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.285827 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bsd6\" (UniqueName: \"kubernetes.io/projected/8221af2a-5c5a-4b77-82da-57f20d0e50c7-kube-api-access-7bsd6\") pod \"cinder-db-create-jt29c\" (UID: \"8221af2a-5c5a-4b77-82da-57f20d0e50c7\") " pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.285873 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf86l\" (UniqueName: \"kubernetes.io/projected/890bdf76-6137-4d6f-b87c-30f5e215cd21-kube-api-access-wf86l\") pod \"manila-db-create-4mczb\" (UID: \"890bdf76-6137-4d6f-b87c-30f5e215cd21\") " pod="openstack/manila-db-create-4mczb" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.289178 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vb2lg"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.290197 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.294093 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.294263 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g28pb" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.294389 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.294554 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.304054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bsd6\" (UniqueName: \"kubernetes.io/projected/8221af2a-5c5a-4b77-82da-57f20d0e50c7-kube-api-access-7bsd6\") pod \"cinder-db-create-jt29c\" (UID: \"8221af2a-5c5a-4b77-82da-57f20d0e50c7\") " pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.311093 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vb2lg"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.311725 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q4bv\" (UniqueName: \"kubernetes.io/projected/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e-kube-api-access-5q4bv\") pod \"barbican-db-create-p2lq5\" (UID: \"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e\") " pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.359093 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.387256 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzbc\" (UniqueName: \"kubernetes.io/projected/5895c0e5-f156-4a12-952c-3690cf355178-kube-api-access-mwzbc\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.387318 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-config-data\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.387356 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-combined-ca-bundle\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.387426 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf86l\" (UniqueName: \"kubernetes.io/projected/890bdf76-6137-4d6f-b87c-30f5e215cd21-kube-api-access-wf86l\") pod \"manila-db-create-4mczb\" (UID: \"890bdf76-6137-4d6f-b87c-30f5e215cd21\") " pod="openstack/manila-db-create-4mczb" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.405854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf86l\" (UniqueName: \"kubernetes.io/projected/890bdf76-6137-4d6f-b87c-30f5e215cd21-kube-api-access-wf86l\") pod \"manila-db-create-4mczb\" (UID: \"890bdf76-6137-4d6f-b87c-30f5e215cd21\") " 
pod="openstack/manila-db-create-4mczb" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.434729 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tncc7"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.435702 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.446921 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.517160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6q5s\" (UniqueName: \"kubernetes.io/projected/1aa050c7-72ad-4eba-8fee-9990ca164f78-kube-api-access-k6q5s\") pod \"neutron-db-create-tncc7\" (UID: \"1aa050c7-72ad-4eba-8fee-9990ca164f78\") " pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.517833 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-combined-ca-bundle\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.517613 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tncc7"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.518527 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzbc\" (UniqueName: \"kubernetes.io/projected/5895c0e5-f156-4a12-952c-3690cf355178-kube-api-access-mwzbc\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.518567 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-config-data\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.531093 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-combined-ca-bundle\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.531600 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-config-data\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.542730 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4mczb" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.545757 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwzbc\" (UniqueName: \"kubernetes.io/projected/5895c0e5-f156-4a12-952c-3690cf355178-kube-api-access-mwzbc\") pod \"keystone-db-sync-vb2lg\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.620456 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6q5s\" (UniqueName: \"kubernetes.io/projected/1aa050c7-72ad-4eba-8fee-9990ca164f78-kube-api-access-k6q5s\") pod \"neutron-db-create-tncc7\" (UID: \"1aa050c7-72ad-4eba-8fee-9990ca164f78\") " pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.647949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6q5s\" (UniqueName: \"kubernetes.io/projected/1aa050c7-72ad-4eba-8fee-9990ca164f78-kube-api-access-k6q5s\") pod \"neutron-db-create-tncc7\" (UID: \"1aa050c7-72ad-4eba-8fee-9990ca164f78\") " pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.668708 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.733490 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.816803 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4zsf7"] Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.820615 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-4zsf7" podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerName="dnsmasq-dns" containerID="cri-o://57b925659a328f586a63e0901e2675c3ffb7a0c1ff15b9be4f3068b1e05c3ee0" gracePeriod=10 Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.850208 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:16 crc kubenswrapper[4744]: I0930 03:12:16.990423 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jt29c"] Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.107816 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-p2lq5"] Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.134484 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tncc7"] Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.163897 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-4mczb"] Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.174781 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jt29c" event={"ID":"8221af2a-5c5a-4b77-82da-57f20d0e50c7","Type":"ContainerStarted","Data":"2664345764d1e45247526d38ded90ccbef3130a9bbf3519aa32a15d654b9a2ed"} Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.176757 4744 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-db-create-p2lq5" event={"ID":"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e","Type":"ContainerStarted","Data":"defc0a858306b079ef0565f1db08f8e5ff11ad33678e47d327b56e9b607b04e9"} Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.179726 4744 generic.go:334] "Generic (PLEG): container finished" podID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerID="57b925659a328f586a63e0901e2675c3ffb7a0c1ff15b9be4f3068b1e05c3ee0" exitCode=0 Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.179768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4zsf7" event={"ID":"6d287118-3aa1-4314-b9fa-6bc54a28878a","Type":"ContainerDied","Data":"57b925659a328f586a63e0901e2675c3ffb7a0c1ff15b9be4f3068b1e05c3ee0"} Sep 30 03:12:17 crc kubenswrapper[4744]: W0930 03:12:17.193808 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod890bdf76_6137_4d6f_b87c_30f5e215cd21.slice/crio-3f3487a0a16f444a4be000320c7e065e22017a5be1923d77ddb64ee49b7226a9 WatchSource:0}: Error finding container 3f3487a0a16f444a4be000320c7e065e22017a5be1923d77ddb64ee49b7226a9: Status 404 returned error can't find the container with id 3f3487a0a16f444a4be000320c7e065e22017a5be1923d77ddb64ee49b7226a9 Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.386324 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vb2lg"] Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.387266 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.543732 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-sb\") pod \"6d287118-3aa1-4314-b9fa-6bc54a28878a\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.543891 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-nb\") pod \"6d287118-3aa1-4314-b9fa-6bc54a28878a\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.543940 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-config\") pod \"6d287118-3aa1-4314-b9fa-6bc54a28878a\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.543981 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-dns-svc\") pod \"6d287118-3aa1-4314-b9fa-6bc54a28878a\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.544010 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk8z4\" (UniqueName: \"kubernetes.io/projected/6d287118-3aa1-4314-b9fa-6bc54a28878a-kube-api-access-rk8z4\") pod \"6d287118-3aa1-4314-b9fa-6bc54a28878a\" (UID: \"6d287118-3aa1-4314-b9fa-6bc54a28878a\") " Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.550769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6d287118-3aa1-4314-b9fa-6bc54a28878a-kube-api-access-rk8z4" (OuterVolumeSpecName: "kube-api-access-rk8z4") pod "6d287118-3aa1-4314-b9fa-6bc54a28878a" (UID: "6d287118-3aa1-4314-b9fa-6bc54a28878a"). InnerVolumeSpecName "kube-api-access-rk8z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.605478 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d287118-3aa1-4314-b9fa-6bc54a28878a" (UID: "6d287118-3aa1-4314-b9fa-6bc54a28878a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.612265 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d287118-3aa1-4314-b9fa-6bc54a28878a" (UID: "6d287118-3aa1-4314-b9fa-6bc54a28878a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.626321 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-config" (OuterVolumeSpecName: "config") pod "6d287118-3aa1-4314-b9fa-6bc54a28878a" (UID: "6d287118-3aa1-4314-b9fa-6bc54a28878a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.636152 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d287118-3aa1-4314-b9fa-6bc54a28878a" (UID: "6d287118-3aa1-4314-b9fa-6bc54a28878a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.681274 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.681299 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk8z4\" (UniqueName: \"kubernetes.io/projected/6d287118-3aa1-4314-b9fa-6bc54a28878a-kube-api-access-rk8z4\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.681311 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.681319 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:17 crc kubenswrapper[4744]: I0930 03:12:17.681346 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d287118-3aa1-4314-b9fa-6bc54a28878a-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.191701 4744 generic.go:334] "Generic (PLEG): container finished" podID="e3e9b9bc-414a-4ddf-8757-6fd3ea97306e" containerID="d4e5c065bf105d380933b3bc1a1d87e4790fdc380c41270a9f7b6c4b0c42ccfc" exitCode=0 Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.191805 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p2lq5" event={"ID":"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e","Type":"ContainerDied","Data":"d4e5c065bf105d380933b3bc1a1d87e4790fdc380c41270a9f7b6c4b0c42ccfc"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 
03:12:18.197763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4zsf7" event={"ID":"6d287118-3aa1-4314-b9fa-6bc54a28878a","Type":"ContainerDied","Data":"083b559b629e906aeafdc72fd696b02b6532038570427b7cf1c3dbf4d7f4e1a9"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.197814 4744 scope.go:117] "RemoveContainer" containerID="57b925659a328f586a63e0901e2675c3ffb7a0c1ff15b9be4f3068b1e05c3ee0" Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.197972 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4zsf7" Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.212800 4744 generic.go:334] "Generic (PLEG): container finished" podID="1aa050c7-72ad-4eba-8fee-9990ca164f78" containerID="c1cfd7551f5035168439ba2e5fcccc50cd250d660465e48451afbd34ef80f772" exitCode=0 Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.212861 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tncc7" event={"ID":"1aa050c7-72ad-4eba-8fee-9990ca164f78","Type":"ContainerDied","Data":"c1cfd7551f5035168439ba2e5fcccc50cd250d660465e48451afbd34ef80f772"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.212887 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tncc7" event={"ID":"1aa050c7-72ad-4eba-8fee-9990ca164f78","Type":"ContainerStarted","Data":"93134725179ea5d8869091b2063e243d0e588e96c6d8c82028efd44ba4635340"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.221162 4744 generic.go:334] "Generic (PLEG): container finished" podID="890bdf76-6137-4d6f-b87c-30f5e215cd21" containerID="3ec60ef389fcbbe40a8f1440d201481ad13716fb445a5bdc46859704a772186e" exitCode=0 Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.221231 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4mczb" 
event={"ID":"890bdf76-6137-4d6f-b87c-30f5e215cd21","Type":"ContainerDied","Data":"3ec60ef389fcbbe40a8f1440d201481ad13716fb445a5bdc46859704a772186e"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.221258 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4mczb" event={"ID":"890bdf76-6137-4d6f-b87c-30f5e215cd21","Type":"ContainerStarted","Data":"3f3487a0a16f444a4be000320c7e065e22017a5be1923d77ddb64ee49b7226a9"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.222503 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vb2lg" event={"ID":"5895c0e5-f156-4a12-952c-3690cf355178","Type":"ContainerStarted","Data":"6b41d79a394bbb2a4de8c6b63c7fedb8b8f0b326ee545e99f90e67edcd8a51e5"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.223554 4744 generic.go:334] "Generic (PLEG): container finished" podID="8221af2a-5c5a-4b77-82da-57f20d0e50c7" containerID="64de70424d8528e88d8673100eba28fa7cda134b75760beff161eae520e4af96" exitCode=0 Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.223590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jt29c" event={"ID":"8221af2a-5c5a-4b77-82da-57f20d0e50c7","Type":"ContainerDied","Data":"64de70424d8528e88d8673100eba28fa7cda134b75760beff161eae520e4af96"} Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.233890 4744 scope.go:117] "RemoveContainer" containerID="02a38a0031bd8a635ab3a51afe3610006095513234954d80c1e338e169d92cc0" Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.256722 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4zsf7"] Sep 30 03:12:18 crc kubenswrapper[4744]: I0930 03:12:18.271027 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4zsf7"] Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.522875 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" path="/var/lib/kubelet/pods/6d287118-3aa1-4314-b9fa-6bc54a28878a/volumes" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.567907 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-4mczb" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.669449 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.672869 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.678488 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.715322 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf86l\" (UniqueName: \"kubernetes.io/projected/890bdf76-6137-4d6f-b87c-30f5e215cd21-kube-api-access-wf86l\") pod \"890bdf76-6137-4d6f-b87c-30f5e215cd21\" (UID: \"890bdf76-6137-4d6f-b87c-30f5e215cd21\") " Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.732709 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890bdf76-6137-4d6f-b87c-30f5e215cd21-kube-api-access-wf86l" (OuterVolumeSpecName: "kube-api-access-wf86l") pod "890bdf76-6137-4d6f-b87c-30f5e215cd21" (UID: "890bdf76-6137-4d6f-b87c-30f5e215cd21"). InnerVolumeSpecName "kube-api-access-wf86l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.817485 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q4bv\" (UniqueName: \"kubernetes.io/projected/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e-kube-api-access-5q4bv\") pod \"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e\" (UID: \"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e\") " Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.817526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bsd6\" (UniqueName: \"kubernetes.io/projected/8221af2a-5c5a-4b77-82da-57f20d0e50c7-kube-api-access-7bsd6\") pod \"8221af2a-5c5a-4b77-82da-57f20d0e50c7\" (UID: \"8221af2a-5c5a-4b77-82da-57f20d0e50c7\") " Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.817594 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6q5s\" (UniqueName: \"kubernetes.io/projected/1aa050c7-72ad-4eba-8fee-9990ca164f78-kube-api-access-k6q5s\") pod \"1aa050c7-72ad-4eba-8fee-9990ca164f78\" (UID: \"1aa050c7-72ad-4eba-8fee-9990ca164f78\") " Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.817979 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf86l\" (UniqueName: \"kubernetes.io/projected/890bdf76-6137-4d6f-b87c-30f5e215cd21-kube-api-access-wf86l\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.823175 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e-kube-api-access-5q4bv" (OuterVolumeSpecName: "kube-api-access-5q4bv") pod "e3e9b9bc-414a-4ddf-8757-6fd3ea97306e" (UID: "e3e9b9bc-414a-4ddf-8757-6fd3ea97306e"). InnerVolumeSpecName "kube-api-access-5q4bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.824350 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa050c7-72ad-4eba-8fee-9990ca164f78-kube-api-access-k6q5s" (OuterVolumeSpecName: "kube-api-access-k6q5s") pod "1aa050c7-72ad-4eba-8fee-9990ca164f78" (UID: "1aa050c7-72ad-4eba-8fee-9990ca164f78"). InnerVolumeSpecName "kube-api-access-k6q5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.826259 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8221af2a-5c5a-4b77-82da-57f20d0e50c7-kube-api-access-7bsd6" (OuterVolumeSpecName: "kube-api-access-7bsd6") pod "8221af2a-5c5a-4b77-82da-57f20d0e50c7" (UID: "8221af2a-5c5a-4b77-82da-57f20d0e50c7"). InnerVolumeSpecName "kube-api-access-7bsd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.919754 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q4bv\" (UniqueName: \"kubernetes.io/projected/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e-kube-api-access-5q4bv\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.919782 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bsd6\" (UniqueName: \"kubernetes.io/projected/8221af2a-5c5a-4b77-82da-57f20d0e50c7-kube-api-access-7bsd6\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:19 crc kubenswrapper[4744]: I0930 03:12:19.919793 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6q5s\" (UniqueName: \"kubernetes.io/projected/1aa050c7-72ad-4eba-8fee-9990ca164f78-kube-api-access-k6q5s\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.241966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p2lq5" 
event={"ID":"e3e9b9bc-414a-4ddf-8757-6fd3ea97306e","Type":"ContainerDied","Data":"defc0a858306b079ef0565f1db08f8e5ff11ad33678e47d327b56e9b607b04e9"} Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.242231 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="defc0a858306b079ef0565f1db08f8e5ff11ad33678e47d327b56e9b607b04e9" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.242020 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p2lq5" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.243566 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tncc7" event={"ID":"1aa050c7-72ad-4eba-8fee-9990ca164f78","Type":"ContainerDied","Data":"93134725179ea5d8869091b2063e243d0e588e96c6d8c82028efd44ba4635340"} Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.243586 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93134725179ea5d8869091b2063e243d0e588e96c6d8c82028efd44ba4635340" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.243609 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tncc7" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.245662 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4mczb" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.245655 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4mczb" event={"ID":"890bdf76-6137-4d6f-b87c-30f5e215cd21","Type":"ContainerDied","Data":"3f3487a0a16f444a4be000320c7e065e22017a5be1923d77ddb64ee49b7226a9"} Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.245779 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f3487a0a16f444a4be000320c7e065e22017a5be1923d77ddb64ee49b7226a9" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.246874 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jt29c" event={"ID":"8221af2a-5c5a-4b77-82da-57f20d0e50c7","Type":"ContainerDied","Data":"2664345764d1e45247526d38ded90ccbef3130a9bbf3519aa32a15d654b9a2ed"} Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.246899 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2664345764d1e45247526d38ded90ccbef3130a9bbf3519aa32a15d654b9a2ed" Sep 30 03:12:20 crc kubenswrapper[4744]: I0930 03:12:20.246917 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-jt29c" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.086972 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-21d3-account-create-w8rr5"] Sep 30 03:12:26 crc kubenswrapper[4744]: E0930 03:12:26.087580 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e9b9bc-414a-4ddf-8757-6fd3ea97306e" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087593 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e9b9bc-414a-4ddf-8757-6fd3ea97306e" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: E0930 03:12:26.087602 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221af2a-5c5a-4b77-82da-57f20d0e50c7" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087608 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221af2a-5c5a-4b77-82da-57f20d0e50c7" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: E0930 03:12:26.087623 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerName="dnsmasq-dns" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087630 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerName="dnsmasq-dns" Sep 30 03:12:26 crc kubenswrapper[4744]: E0930 03:12:26.087639 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890bdf76-6137-4d6f-b87c-30f5e215cd21" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087645 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="890bdf76-6137-4d6f-b87c-30f5e215cd21" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: E0930 03:12:26.087656 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerName="init" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087661 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerName="init" Sep 30 03:12:26 crc kubenswrapper[4744]: E0930 03:12:26.087676 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa050c7-72ad-4eba-8fee-9990ca164f78" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087682 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa050c7-72ad-4eba-8fee-9990ca164f78" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087819 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221af2a-5c5a-4b77-82da-57f20d0e50c7" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087829 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d287118-3aa1-4314-b9fa-6bc54a28878a" containerName="dnsmasq-dns" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087840 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e9b9bc-414a-4ddf-8757-6fd3ea97306e" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087850 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="890bdf76-6137-4d6f-b87c-30f5e215cd21" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.087865 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa050c7-72ad-4eba-8fee-9990ca164f78" containerName="mariadb-database-create" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.088347 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.095455 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.101084 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-21d3-account-create-w8rr5"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.168959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m9kd\" (UniqueName: \"kubernetes.io/projected/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718-kube-api-access-5m9kd\") pod \"barbican-21d3-account-create-w8rr5\" (UID: \"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718\") " pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.180073 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86ab-account-create-7bn4b"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.181270 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.185731 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.187985 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86ab-account-create-7bn4b"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.270165 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2blm9\" (UniqueName: \"kubernetes.io/projected/26cadf80-2d75-4df3-a182-17078060bd12-kube-api-access-2blm9\") pod \"cinder-86ab-account-create-7bn4b\" (UID: \"26cadf80-2d75-4df3-a182-17078060bd12\") " pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.270252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m9kd\" (UniqueName: \"kubernetes.io/projected/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718-kube-api-access-5m9kd\") pod \"barbican-21d3-account-create-w8rr5\" (UID: \"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718\") " pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.311500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m9kd\" (UniqueName: \"kubernetes.io/projected/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718-kube-api-access-5m9kd\") pod \"barbican-21d3-account-create-w8rr5\" (UID: \"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718\") " pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.372712 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2blm9\" (UniqueName: \"kubernetes.io/projected/26cadf80-2d75-4df3-a182-17078060bd12-kube-api-access-2blm9\") pod \"cinder-86ab-account-create-7bn4b\" (UID: 
\"26cadf80-2d75-4df3-a182-17078060bd12\") " pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.384936 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-b39b-account-create-vdd2g"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.388312 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.393026 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.394639 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2blm9\" (UniqueName: \"kubernetes.io/projected/26cadf80-2d75-4df3-a182-17078060bd12-kube-api-access-2blm9\") pod \"cinder-86ab-account-create-7bn4b\" (UID: \"26cadf80-2d75-4df3-a182-17078060bd12\") " pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.403413 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b39b-account-create-vdd2g"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.412697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.500518 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.576213 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf5zm\" (UniqueName: \"kubernetes.io/projected/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea-kube-api-access-pf5zm\") pod \"manila-b39b-account-create-vdd2g\" (UID: \"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea\") " pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.577700 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9eca-account-create-n9msd"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.578982 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.580586 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.585510 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9eca-account-create-n9msd"] Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.677473 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf5zm\" (UniqueName: \"kubernetes.io/projected/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea-kube-api-access-pf5zm\") pod \"manila-b39b-account-create-vdd2g\" (UID: \"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea\") " pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.677636 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqnr6\" (UniqueName: \"kubernetes.io/projected/11f58173-fb96-4909-ae89-9648dddcedf0-kube-api-access-gqnr6\") pod \"neutron-9eca-account-create-n9msd\" (UID: \"11f58173-fb96-4909-ae89-9648dddcedf0\") " 
pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.694441 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf5zm\" (UniqueName: \"kubernetes.io/projected/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea-kube-api-access-pf5zm\") pod \"manila-b39b-account-create-vdd2g\" (UID: \"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea\") " pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.736503 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.780436 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqnr6\" (UniqueName: \"kubernetes.io/projected/11f58173-fb96-4909-ae89-9648dddcedf0-kube-api-access-gqnr6\") pod \"neutron-9eca-account-create-n9msd\" (UID: \"11f58173-fb96-4909-ae89-9648dddcedf0\") " pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.797345 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqnr6\" (UniqueName: \"kubernetes.io/projected/11f58173-fb96-4909-ae89-9648dddcedf0-kube-api-access-gqnr6\") pod \"neutron-9eca-account-create-n9msd\" (UID: \"11f58173-fb96-4909-ae89-9648dddcedf0\") " pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:26 crc kubenswrapper[4744]: I0930 03:12:26.908513 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:29 crc kubenswrapper[4744]: I0930 03:12:29.653142 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-21d3-account-create-w8rr5"] Sep 30 03:12:29 crc kubenswrapper[4744]: I0930 03:12:29.774710 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b39b-account-create-vdd2g"] Sep 30 03:12:29 crc kubenswrapper[4744]: W0930 03:12:29.782176 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c1aaa7c_986e_4b9e_b8a2_fc913a6230ea.slice/crio-f6139d007809f78fab5b1edcbe22b78bd650a04c4baee7121cdf7472c47ad150 WatchSource:0}: Error finding container f6139d007809f78fab5b1edcbe22b78bd650a04c4baee7121cdf7472c47ad150: Status 404 returned error can't find the container with id f6139d007809f78fab5b1edcbe22b78bd650a04c4baee7121cdf7472c47ad150 Sep 30 03:12:29 crc kubenswrapper[4744]: I0930 03:12:29.783482 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9eca-account-create-n9msd"] Sep 30 03:12:29 crc kubenswrapper[4744]: I0930 03:12:29.792464 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86ab-account-create-7bn4b"] Sep 30 03:12:30 crc kubenswrapper[4744]: E0930 03:12:30.072030 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3adb9ba0_b837_4ba5_9fa9_faa07f3dc718.slice/crio-15dcb2a2dea1e11b7f480a9931abdb951390c946451a21ed5ea7cad5f455e442.scope\": RecentStats: unable to find data in memory cache]" Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.375221 4744 generic.go:334] "Generic (PLEG): container finished" podID="11f58173-fb96-4909-ae89-9648dddcedf0" containerID="49000545f8bd30abbc27e44a25e927d55cab885e11a61196be3c674b05913650" exitCode=0 Sep 30 03:12:30 crc 
kubenswrapper[4744]: I0930 03:12:30.375306 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9eca-account-create-n9msd" event={"ID":"11f58173-fb96-4909-ae89-9648dddcedf0","Type":"ContainerDied","Data":"49000545f8bd30abbc27e44a25e927d55cab885e11a61196be3c674b05913650"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.375746 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9eca-account-create-n9msd" event={"ID":"11f58173-fb96-4909-ae89-9648dddcedf0","Type":"ContainerStarted","Data":"7512476a2189c052c3bb2eda0839f252b56e371d96d1e13a5b893fab74cd36f6"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.379679 4744 generic.go:334] "Generic (PLEG): container finished" podID="3adb9ba0-b837-4ba5-9fa9-faa07f3dc718" containerID="15dcb2a2dea1e11b7f480a9931abdb951390c946451a21ed5ea7cad5f455e442" exitCode=0 Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.379813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21d3-account-create-w8rr5" event={"ID":"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718","Type":"ContainerDied","Data":"15dcb2a2dea1e11b7f480a9931abdb951390c946451a21ed5ea7cad5f455e442"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.379844 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21d3-account-create-w8rr5" event={"ID":"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718","Type":"ContainerStarted","Data":"419eb6ec6677a30f79365c2776a5d00fbaa292601eb46fb7745c43444a5e1951"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.382957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rzbkf" event={"ID":"dab1acfa-0313-4621-9d6e-6ab34807d0e5","Type":"ContainerStarted","Data":"736e67b520621626ade8ae88a72df0e7390e168b4daadd51e756193ff9493e52"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.386260 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vb2lg" 
event={"ID":"5895c0e5-f156-4a12-952c-3690cf355178","Type":"ContainerStarted","Data":"f8cb28a160438fd03a3d29cb6842ae9be12b992183f86c0ea5d6df3ba8b6ca41"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.389625 4744 generic.go:334] "Generic (PLEG): container finished" podID="26cadf80-2d75-4df3-a182-17078060bd12" containerID="118332e00d4481fed966331aa6b3917ade1485cbc92c6fad7636f4273e0372e6" exitCode=0 Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.389718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86ab-account-create-7bn4b" event={"ID":"26cadf80-2d75-4df3-a182-17078060bd12","Type":"ContainerDied","Data":"118332e00d4481fed966331aa6b3917ade1485cbc92c6fad7636f4273e0372e6"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.389747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86ab-account-create-7bn4b" event={"ID":"26cadf80-2d75-4df3-a182-17078060bd12","Type":"ContainerStarted","Data":"1bdfd96583450f66555bf5123049b4c3c47f22bc6a69ddaaf70e918c85b451e0"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.392633 4744 generic.go:334] "Generic (PLEG): container finished" podID="1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea" containerID="b076b97a7f60a0a3d96488ba33d74bb941f97db0a60f0e614cf1d8f67e80394a" exitCode=0 Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.392676 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b39b-account-create-vdd2g" event={"ID":"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea","Type":"ContainerDied","Data":"b076b97a7f60a0a3d96488ba33d74bb941f97db0a60f0e614cf1d8f67e80394a"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.393007 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b39b-account-create-vdd2g" event={"ID":"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea","Type":"ContainerStarted","Data":"f6139d007809f78fab5b1edcbe22b78bd650a04c4baee7121cdf7472c47ad150"} Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.457344 
4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rzbkf" podStartSLOduration=3.079960988 podStartE2EDuration="18.457317365s" podCreationTimestamp="2025-09-30 03:12:12 +0000 UTC" firstStartedPulling="2025-09-30 03:12:13.92870685 +0000 UTC m=+1061.101926824" lastFinishedPulling="2025-09-30 03:12:29.306063217 +0000 UTC m=+1076.479283201" observedRunningTime="2025-09-30 03:12:30.447160231 +0000 UTC m=+1077.620380245" watchObservedRunningTime="2025-09-30 03:12:30.457317365 +0000 UTC m=+1077.630537369" Sep 30 03:12:30 crc kubenswrapper[4744]: I0930 03:12:30.517658 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vb2lg" podStartSLOduration=2.631624473 podStartE2EDuration="14.517633207s" podCreationTimestamp="2025-09-30 03:12:16 +0000 UTC" firstStartedPulling="2025-09-30 03:12:17.404473299 +0000 UTC m=+1064.577693273" lastFinishedPulling="2025-09-30 03:12:29.290481993 +0000 UTC m=+1076.463702007" observedRunningTime="2025-09-30 03:12:30.517186244 +0000 UTC m=+1077.690406288" watchObservedRunningTime="2025-09-30 03:12:30.517633207 +0000 UTC m=+1077.690853211" Sep 30 03:12:31 crc kubenswrapper[4744]: I0930 03:12:31.895700 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:31 crc kubenswrapper[4744]: I0930 03:12:31.900419 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:31 crc kubenswrapper[4744]: I0930 03:12:31.910443 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:31 crc kubenswrapper[4744]: I0930 03:12:31.922027 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.072862 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqnr6\" (UniqueName: \"kubernetes.io/projected/11f58173-fb96-4909-ae89-9648dddcedf0-kube-api-access-gqnr6\") pod \"11f58173-fb96-4909-ae89-9648dddcedf0\" (UID: \"11f58173-fb96-4909-ae89-9648dddcedf0\") " Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.073016 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m9kd\" (UniqueName: \"kubernetes.io/projected/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718-kube-api-access-5m9kd\") pod \"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718\" (UID: \"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718\") " Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.073073 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf5zm\" (UniqueName: \"kubernetes.io/projected/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea-kube-api-access-pf5zm\") pod \"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea\" (UID: \"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea\") " Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.073136 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2blm9\" (UniqueName: \"kubernetes.io/projected/26cadf80-2d75-4df3-a182-17078060bd12-kube-api-access-2blm9\") pod \"26cadf80-2d75-4df3-a182-17078060bd12\" (UID: \"26cadf80-2d75-4df3-a182-17078060bd12\") " Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.079040 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26cadf80-2d75-4df3-a182-17078060bd12-kube-api-access-2blm9" (OuterVolumeSpecName: "kube-api-access-2blm9") pod "26cadf80-2d75-4df3-a182-17078060bd12" (UID: "26cadf80-2d75-4df3-a182-17078060bd12"). InnerVolumeSpecName "kube-api-access-2blm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.079284 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea-kube-api-access-pf5zm" (OuterVolumeSpecName: "kube-api-access-pf5zm") pod "1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea" (UID: "1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea"). InnerVolumeSpecName "kube-api-access-pf5zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.079730 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f58173-fb96-4909-ae89-9648dddcedf0-kube-api-access-gqnr6" (OuterVolumeSpecName: "kube-api-access-gqnr6") pod "11f58173-fb96-4909-ae89-9648dddcedf0" (UID: "11f58173-fb96-4909-ae89-9648dddcedf0"). InnerVolumeSpecName "kube-api-access-gqnr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.080451 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718-kube-api-access-5m9kd" (OuterVolumeSpecName: "kube-api-access-5m9kd") pod "3adb9ba0-b837-4ba5-9fa9-faa07f3dc718" (UID: "3adb9ba0-b837-4ba5-9fa9-faa07f3dc718"). InnerVolumeSpecName "kube-api-access-5m9kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.175688 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqnr6\" (UniqueName: \"kubernetes.io/projected/11f58173-fb96-4909-ae89-9648dddcedf0-kube-api-access-gqnr6\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.175738 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m9kd\" (UniqueName: \"kubernetes.io/projected/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718-kube-api-access-5m9kd\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.175758 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf5zm\" (UniqueName: \"kubernetes.io/projected/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea-kube-api-access-pf5zm\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.175776 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2blm9\" (UniqueName: \"kubernetes.io/projected/26cadf80-2d75-4df3-a182-17078060bd12-kube-api-access-2blm9\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.414683 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86ab-account-create-7bn4b" event={"ID":"26cadf80-2d75-4df3-a182-17078060bd12","Type":"ContainerDied","Data":"1bdfd96583450f66555bf5123049b4c3c47f22bc6a69ddaaf70e918c85b451e0"} Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.416048 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdfd96583450f66555bf5123049b4c3c47f22bc6a69ddaaf70e918c85b451e0" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.414702 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86ab-account-create-7bn4b" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.417215 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b39b-account-create-vdd2g" event={"ID":"1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea","Type":"ContainerDied","Data":"f6139d007809f78fab5b1edcbe22b78bd650a04c4baee7121cdf7472c47ad150"} Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.417274 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6139d007809f78fab5b1edcbe22b78bd650a04c4baee7121cdf7472c47ad150" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.417297 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b39b-account-create-vdd2g" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.422246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-21d3-account-create-w8rr5" event={"ID":"3adb9ba0-b837-4ba5-9fa9-faa07f3dc718","Type":"ContainerDied","Data":"419eb6ec6677a30f79365c2776a5d00fbaa292601eb46fb7745c43444a5e1951"} Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.422281 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-21d3-account-create-w8rr5" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.422294 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419eb6ec6677a30f79365c2776a5d00fbaa292601eb46fb7745c43444a5e1951" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.424598 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9eca-account-create-n9msd" event={"ID":"11f58173-fb96-4909-ae89-9648dddcedf0","Type":"ContainerDied","Data":"7512476a2189c052c3bb2eda0839f252b56e371d96d1e13a5b893fab74cd36f6"} Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.424643 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7512476a2189c052c3bb2eda0839f252b56e371d96d1e13a5b893fab74cd36f6" Sep 30 03:12:32 crc kubenswrapper[4744]: I0930 03:12:32.424717 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9eca-account-create-n9msd" Sep 30 03:12:33 crc kubenswrapper[4744]: I0930 03:12:33.434160 4744 generic.go:334] "Generic (PLEG): container finished" podID="5895c0e5-f156-4a12-952c-3690cf355178" containerID="f8cb28a160438fd03a3d29cb6842ae9be12b992183f86c0ea5d6df3ba8b6ca41" exitCode=0 Sep 30 03:12:33 crc kubenswrapper[4744]: I0930 03:12:33.434216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vb2lg" event={"ID":"5895c0e5-f156-4a12-952c-3690cf355178","Type":"ContainerDied","Data":"f8cb28a160438fd03a3d29cb6842ae9be12b992183f86c0ea5d6df3ba8b6ca41"} Sep 30 03:12:34 crc kubenswrapper[4744]: I0930 03:12:34.348002 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:12:34 crc kubenswrapper[4744]: I0930 
03:12:34.348513 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:12:34 crc kubenswrapper[4744]: I0930 03:12:34.838200 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.025479 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-combined-ca-bundle\") pod \"5895c0e5-f156-4a12-952c-3690cf355178\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.025851 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwzbc\" (UniqueName: \"kubernetes.io/projected/5895c0e5-f156-4a12-952c-3690cf355178-kube-api-access-mwzbc\") pod \"5895c0e5-f156-4a12-952c-3690cf355178\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.026004 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-config-data\") pod \"5895c0e5-f156-4a12-952c-3690cf355178\" (UID: \"5895c0e5-f156-4a12-952c-3690cf355178\") " Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.040589 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5895c0e5-f156-4a12-952c-3690cf355178-kube-api-access-mwzbc" (OuterVolumeSpecName: "kube-api-access-mwzbc") pod "5895c0e5-f156-4a12-952c-3690cf355178" (UID: "5895c0e5-f156-4a12-952c-3690cf355178"). 
InnerVolumeSpecName "kube-api-access-mwzbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.069211 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5895c0e5-f156-4a12-952c-3690cf355178" (UID: "5895c0e5-f156-4a12-952c-3690cf355178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.101538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-config-data" (OuterVolumeSpecName: "config-data") pod "5895c0e5-f156-4a12-952c-3690cf355178" (UID: "5895c0e5-f156-4a12-952c-3690cf355178"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.128411 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwzbc\" (UniqueName: \"kubernetes.io/projected/5895c0e5-f156-4a12-952c-3690cf355178-kube-api-access-mwzbc\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.128446 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.128458 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5895c0e5-f156-4a12-952c-3690cf355178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.465979 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vb2lg" 
event={"ID":"5895c0e5-f156-4a12-952c-3690cf355178","Type":"ContainerDied","Data":"6b41d79a394bbb2a4de8c6b63c7fedb8b8f0b326ee545e99f90e67edcd8a51e5"} Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.466044 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b41d79a394bbb2a4de8c6b63c7fedb8b8f0b326ee545e99f90e67edcd8a51e5" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.466070 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vb2lg" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.740631 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w4665"] Sep 30 03:12:35 crc kubenswrapper[4744]: E0930 03:12:35.741298 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f58173-fb96-4909-ae89-9648dddcedf0" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741318 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f58173-fb96-4909-ae89-9648dddcedf0" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: E0930 03:12:35.741341 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26cadf80-2d75-4df3-a182-17078060bd12" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741347 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="26cadf80-2d75-4df3-a182-17078060bd12" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: E0930 03:12:35.741361 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5895c0e5-f156-4a12-952c-3690cf355178" containerName="keystone-db-sync" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741397 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="5895c0e5-f156-4a12-952c-3690cf355178" containerName="keystone-db-sync" Sep 30 03:12:35 crc kubenswrapper[4744]: E0930 03:12:35.741413 4744 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741420 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: E0930 03:12:35.741430 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adb9ba0-b837-4ba5-9fa9-faa07f3dc718" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741436 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adb9ba0-b837-4ba5-9fa9-faa07f3dc718" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741612 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f58173-fb96-4909-ae89-9648dddcedf0" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741622 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3adb9ba0-b837-4ba5-9fa9-faa07f3dc718" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741640 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="26cadf80-2d75-4df3-a182-17078060bd12" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741653 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="5895c0e5-f156-4a12-952c-3690cf355178" containerName="keystone-db-sync" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.741661 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea" containerName="mariadb-account-create" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.742254 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.745024 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.745213 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.745320 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.748634 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g28pb" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.752654 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wq8th"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.754104 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.763777 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w4665"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.773870 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wq8th"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840601 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840656 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-credential-keys\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-scripts\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840718 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-config-data\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840741 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-fernet-keys\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840779 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8h7n\" (UniqueName: \"kubernetes.io/projected/38f32dab-478a-48a5-bdce-066bf22d4367-kube-api-access-p8h7n\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840797 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphxf\" 
(UniqueName: \"kubernetes.io/projected/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-kube-api-access-wphxf\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840821 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840864 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-config\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840884 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840908 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-combined-ca-bundle\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.840933 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-svc\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.855884 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-758dd45f85-fgxnt"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.857143 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.862733 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.863031 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-x6q24" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.863151 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.863630 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.884159 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758dd45f85-fgxnt"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949752 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949834 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-scripts\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949856 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-combined-ca-bundle\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949882 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-svc\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66d32c30-b69c-4637-97a9-c1112b954a92-horizon-secret-key\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949923 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d32c30-b69c-4637-97a9-c1112b954a92-logs\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949968 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.949986 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-config-data\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950015 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-credential-keys\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-scripts\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-config-data\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950102 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-fernet-keys\") pod 
\"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950124 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8h7n\" (UniqueName: \"kubernetes.io/projected/38f32dab-478a-48a5-bdce-066bf22d4367-kube-api-access-p8h7n\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950143 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphxf\" (UniqueName: \"kubernetes.io/projected/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-kube-api-access-wphxf\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950167 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950194 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wrc\" (UniqueName: \"kubernetes.io/projected/66d32c30-b69c-4637-97a9-c1112b954a92-kube-api-access-59wrc\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950211 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-config\") pod \"dnsmasq-dns-55fff446b9-wq8th\" 
(UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.950971 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-config\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.951535 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.951839 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.953147 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-svc\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.953252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:35 crc 
kubenswrapper[4744]: I0930 03:12:35.955223 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-config-data\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.955806 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-796557ff95-kphjm"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.956462 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-fernet-keys\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.957010 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.957979 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-scripts\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.972054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-combined-ca-bundle\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.973854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-credential-keys\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.983262 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-796557ff95-kphjm"] Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.983533 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphxf\" (UniqueName: \"kubernetes.io/projected/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-kube-api-access-wphxf\") pod \"keystone-bootstrap-w4665\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:35 crc kubenswrapper[4744]: I0930 03:12:35.985239 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8h7n\" (UniqueName: \"kubernetes.io/projected/38f32dab-478a-48a5-bdce-066bf22d4367-kube-api-access-p8h7n\") pod \"dnsmasq-dns-55fff446b9-wq8th\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.047782 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.052293 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmvg\" (UniqueName: \"kubernetes.io/projected/aa042742-a24d-4cf6-aecf-20b41b3287b4-kube-api-access-6rmvg\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.053050 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.056567 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.056703 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.057510 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-scripts\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.057576 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-config-data\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.057733 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa042742-a24d-4cf6-aecf-20b41b3287b4-logs\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.057869 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59wrc\" (UniqueName: \"kubernetes.io/projected/66d32c30-b69c-4637-97a9-c1112b954a92-kube-api-access-59wrc\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 
03:12:36.057938 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-scripts\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.057985 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa042742-a24d-4cf6-aecf-20b41b3287b4-horizon-secret-key\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.058026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66d32c30-b69c-4637-97a9-c1112b954a92-horizon-secret-key\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.058043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d32c30-b69c-4637-97a9-c1112b954a92-logs\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.058112 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-config-data\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.059343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-scripts\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.059807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-config-data\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.062272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d32c30-b69c-4637-97a9-c1112b954a92-logs\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.065551 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.078024 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.082899 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.083594 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66d32c30-b69c-4637-97a9-c1112b954a92-horizon-secret-key\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.093653 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wrc\" (UniqueName: \"kubernetes.io/projected/66d32c30-b69c-4637-97a9-c1112b954a92-kube-api-access-59wrc\") pod \"horizon-758dd45f85-fgxnt\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.123718 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dpxjq"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.124833 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.131211 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wq8th"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.152017 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8z9nj" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.152047 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.152231 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159275 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-config-data\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159333 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj846\" (UniqueName: \"kubernetes.io/projected/c56c5a65-d4fe-4772-ba30-eae95674c422-kube-api-access-kj846\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159356 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159390 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6rmvg\" (UniqueName: \"kubernetes.io/projected/aa042742-a24d-4cf6-aecf-20b41b3287b4-kube-api-access-6rmvg\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159420 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-scripts\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159438 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-run-httpd\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159458 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-config-data\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159492 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-log-httpd\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159522 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa042742-a24d-4cf6-aecf-20b41b3287b4-logs\") pod 
\"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159546 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa042742-a24d-4cf6-aecf-20b41b3287b4-horizon-secret-key\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.159605 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-scripts\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.160195 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa042742-a24d-4cf6-aecf-20b41b3287b4-logs\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.160277 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-scripts\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: 
I0930 03:12:36.161084 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-config-data\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.164585 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dpxjq"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.180638 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.195915 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa042742-a24d-4cf6-aecf-20b41b3287b4-horizon-secret-key\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.210017 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qhm9q"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.215465 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmvg\" (UniqueName: \"kubernetes.io/projected/aa042742-a24d-4cf6-aecf-20b41b3287b4-kube-api-access-6rmvg\") pod \"horizon-796557ff95-kphjm\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263430 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-log-httpd\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 
03:12:36.263487 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d629c05-1300-4fb5-8f08-211a133fffe8-logs\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263591 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-combined-ca-bundle\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263640 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-scripts\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263664 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-config-data\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263844 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-config-data\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263926 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj846\" (UniqueName: \"kubernetes.io/projected/c56c5a65-d4fe-4772-ba30-eae95674c422-kube-api-access-kj846\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.263966 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.264048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-run-httpd\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.264080 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lft7\" (UniqueName: \"kubernetes.io/projected/6d629c05-1300-4fb5-8f08-211a133fffe8-kube-api-access-8lft7\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.264125 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-scripts\") pod \"placement-db-sync-dpxjq\" (UID: 
\"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.264146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-log-httpd\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.264930 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-run-httpd\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.267593 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.267647 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.268650 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-scripts\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.280295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-config-data\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.282912 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj846\" (UniqueName: \"kubernetes.io/projected/c56c5a65-d4fe-4772-ba30-eae95674c422-kube-api-access-kj846\") pod \"ceilometer-0\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.288487 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.314937 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qhm9q"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.345892 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.374277 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lft7\" (UniqueName: \"kubernetes.io/projected/6d629c05-1300-4fb5-8f08-211a133fffe8-kube-api-access-8lft7\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.374328 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-scripts\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.374399 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6d629c05-1300-4fb5-8f08-211a133fffe8-logs\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.374498 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-combined-ca-bundle\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.374579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-config-data\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.375624 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d629c05-1300-4fb5-8f08-211a133fffe8-logs\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.378438 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-scripts\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.379873 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-combined-ca-bundle\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " 
pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.380691 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-config-data\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.396210 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lft7\" (UniqueName: \"kubernetes.io/projected/6d629c05-1300-4fb5-8f08-211a133fffe8-kube-api-access-8lft7\") pod \"placement-db-sync-dpxjq\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.475585 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.475899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.475945 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" 
Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.475978 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-config\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.475994 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7rhh\" (UniqueName: \"kubernetes.io/projected/982a253c-4987-43f1-896c-1ce2fa503826-kube-api-access-p7rhh\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.476018 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.482744 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.491701 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-75phb"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.493826 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.496875 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dpxjq" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.499669 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-75phb"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.501578 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xn7nl" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.501840 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.577160 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.577243 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-config\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.577264 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7rhh\" (UniqueName: \"kubernetes.io/projected/982a253c-4987-43f1-896c-1ce2fa503826-kube-api-access-p7rhh\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.577305 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-swift-storage-0\") 
pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.577810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.577835 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.578092 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.578268 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.578854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-config\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " 
pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.578894 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.578895 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.584127 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bvn26"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.585227 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.593044 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-j5xfr" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.593249 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.593502 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.596953 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7rhh\" (UniqueName: \"kubernetes.io/projected/982a253c-4987-43f1-896c-1ce2fa503826-kube-api-access-p7rhh\") pod \"dnsmasq-dns-76fcf4b695-qhm9q\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.600731 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bvn26"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.634074 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.655709 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w4665"] Sep 30 03:12:36 crc kubenswrapper[4744]: W0930 03:12:36.676626 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69fa47c0_aec4_4cf8_9882_1e856ebfaaf8.slice/crio-7722736858fbf776b849db84dd42301838b090460cebd427647c8779359dd2e0 WatchSource:0}: Error finding container 7722736858fbf776b849db84dd42301838b090460cebd427647c8779359dd2e0: Status 404 returned error can't find the container with id 7722736858fbf776b849db84dd42301838b090460cebd427647c8779359dd2e0 Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.679758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-db-sync-config-data\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.679819 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-combined-ca-bundle\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.679866 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-kube-api-access-bnrg4\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 
03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.730334 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wq8th"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781405 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-combined-ca-bundle\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781460 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-scripts\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781483 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b19763-eb29-45ca-9431-8791543dee83-etc-machine-id\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fcj\" (UniqueName: \"kubernetes.io/projected/72b19763-eb29-45ca-9431-8791543dee83-kube-api-access-h2fcj\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781536 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-db-sync-config-data\") pod 
\"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-config-data\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781593 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-combined-ca-bundle\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781629 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-kube-api-access-bnrg4\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.781657 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-db-sync-config-data\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.787591 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-pwqjw"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.788172 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-combined-ca-bundle\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.788744 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.790690 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-ljrm7" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.792830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-db-sync-config-data\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.794336 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.795398 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-pwqjw"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.810727 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-kube-api-access-bnrg4\") pod \"barbican-db-sync-75phb\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.819591 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-75phb" Sep 30 03:12:36 crc kubenswrapper[4744]: W0930 03:12:36.845098 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d32c30_b69c_4637_97a9_c1112b954a92.slice/crio-299c19b6853a008074c6f722837c470a495d6d2854505b1e154d1de45e813530 WatchSource:0}: Error finding container 299c19b6853a008074c6f722837c470a495d6d2854505b1e154d1de45e813530: Status 404 returned error can't find the container with id 299c19b6853a008074c6f722837c470a495d6d2854505b1e154d1de45e813530 Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.853227 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758dd45f85-fgxnt"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.877954 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-5rqpd"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.879638 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.885324 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vvvwl" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.885449 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.885634 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.888094 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5rqpd"] Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889185 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4skk\" (UniqueName: \"kubernetes.io/projected/a24c42a2-4afa-4c32-ba87-18251fd1345a-kube-api-access-q4skk\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889236 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-scripts\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889266 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b19763-eb29-45ca-9431-8791543dee83-etc-machine-id\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889295 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-config-data\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fcj\" (UniqueName: \"kubernetes.io/projected/72b19763-eb29-45ca-9431-8791543dee83-kube-api-access-h2fcj\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889395 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-config-data\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889473 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-db-sync-config-data\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889493 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-job-config-data\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889529 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-combined-ca-bundle\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:36 crc kubenswrapper[4744]: I0930 03:12:36.889560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-combined-ca-bundle\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.894159 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b19763-eb29-45ca-9431-8791543dee83-etc-machine-id\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.895182 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-combined-ca-bundle\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.896769 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-db-sync-config-data\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.897387 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-scripts\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " 
pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.900400 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-config-data\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.917311 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-796557ff95-kphjm"] Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.946155 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fcj\" (UniqueName: \"kubernetes.io/projected/72b19763-eb29-45ca-9431-8791543dee83-kube-api-access-h2fcj\") pod \"cinder-db-sync-bvn26\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-job-config-data\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991516 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-combined-ca-bundle\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991552 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4skk\" (UniqueName: \"kubernetes.io/projected/a24c42a2-4afa-4c32-ba87-18251fd1345a-kube-api-access-q4skk\") pod \"manila-db-sync-pwqjw\" (UID: 
\"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991589 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-combined-ca-bundle\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-config-data\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991624 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-config\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.991701 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwp4m\" (UniqueName: \"kubernetes.io/projected/b10ecb07-4d75-4842-a753-f76c3a1d3b62-kube-api-access-vwp4m\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.997262 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-job-config-data\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 
03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.997792 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-combined-ca-bundle\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:36.998928 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-config-data\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.008600 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4skk\" (UniqueName: \"kubernetes.io/projected/a24c42a2-4afa-4c32-ba87-18251fd1345a-kube-api-access-q4skk\") pod \"manila-db-sync-pwqjw\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.053979 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:12:37 crc kubenswrapper[4744]: W0930 03:12:37.065559 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc56c5a65_d4fe_4772_ba30_eae95674c422.slice/crio-8306c8a75be436fa60369da62fb6f3bf76195618f96a45456fd6e25e5a834ac4 WatchSource:0}: Error finding container 8306c8a75be436fa60369da62fb6f3bf76195618f96a45456fd6e25e5a834ac4: Status 404 returned error can't find the container with id 8306c8a75be436fa60369da62fb6f3bf76195618f96a45456fd6e25e5a834ac4 Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.093012 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwp4m\" (UniqueName: 
\"kubernetes.io/projected/b10ecb07-4d75-4842-a753-f76c3a1d3b62-kube-api-access-vwp4m\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.093139 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-combined-ca-bundle\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.093180 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-config\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.097950 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-combined-ca-bundle\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.098487 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-config\") pod \"neutron-db-sync-5rqpd\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.116080 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwp4m\" (UniqueName: \"kubernetes.io/projected/b10ecb07-4d75-4842-a753-f76c3a1d3b62-kube-api-access-vwp4m\") pod \"neutron-db-sync-5rqpd\" (UID: 
\"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: W0930 03:12:37.120868 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d629c05_1300_4fb5_8f08_211a133fffe8.slice/crio-ade53214afd893e70d03b78d08eee4e09d58d22efbfdd4d4e2ff34eae53284ec WatchSource:0}: Error finding container ade53214afd893e70d03b78d08eee4e09d58d22efbfdd4d4e2ff34eae53284ec: Status 404 returned error can't find the container with id ade53214afd893e70d03b78d08eee4e09d58d22efbfdd4d4e2ff34eae53284ec Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.121639 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dpxjq"] Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.135812 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-pwqjw" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.206827 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qhm9q"] Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.214078 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bvn26" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.221320 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:12:37 crc kubenswrapper[4744]: W0930 03:12:37.236501 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod982a253c_4987_43f1_896c_1ce2fa503826.slice/crio-81679733c8f4330d18307a74bc460f6f8858d568df6a938e3efb4141e1271c41 WatchSource:0}: Error finding container 81679733c8f4330d18307a74bc460f6f8858d568df6a938e3efb4141e1271c41: Status 404 returned error can't find the container with id 81679733c8f4330d18307a74bc460f6f8858d568df6a938e3efb4141e1271c41 Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.491868 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758dd45f85-fgxnt" event={"ID":"66d32c30-b69c-4637-97a9-c1112b954a92","Type":"ContainerStarted","Data":"299c19b6853a008074c6f722837c470a495d6d2854505b1e154d1de45e813530"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.494015 4744 generic.go:334] "Generic (PLEG): container finished" podID="38f32dab-478a-48a5-bdce-066bf22d4367" containerID="8bd923d140def36bc0bbf9dbca916897b794922b3a8ddfa38c51a1205a82c011" exitCode=0 Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.494060 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" event={"ID":"38f32dab-478a-48a5-bdce-066bf22d4367","Type":"ContainerDied","Data":"8bd923d140def36bc0bbf9dbca916897b794922b3a8ddfa38c51a1205a82c011"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.494075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" event={"ID":"38f32dab-478a-48a5-bdce-066bf22d4367","Type":"ContainerStarted","Data":"3c1ebf4fe252aa9eb69a88be808e598c57858d55b822147232ca632877cde331"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.496077 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" 
event={"ID":"982a253c-4987-43f1-896c-1ce2fa503826","Type":"ContainerStarted","Data":"81679733c8f4330d18307a74bc460f6f8858d568df6a938e3efb4141e1271c41"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.519924 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4665" event={"ID":"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8","Type":"ContainerStarted","Data":"3a22f35c6d1fc6b15165df25b9cf2424bf9c5f9423ecc4446132535757f7b8f9"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.520409 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4665" event={"ID":"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8","Type":"ContainerStarted","Data":"7722736858fbf776b849db84dd42301838b090460cebd427647c8779359dd2e0"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.520422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerStarted","Data":"8306c8a75be436fa60369da62fb6f3bf76195618f96a45456fd6e25e5a834ac4"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.520432 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dpxjq" event={"ID":"6d629c05-1300-4fb5-8f08-211a133fffe8","Type":"ContainerStarted","Data":"ade53214afd893e70d03b78d08eee4e09d58d22efbfdd4d4e2ff34eae53284ec"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.520444 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796557ff95-kphjm" event={"ID":"aa042742-a24d-4cf6-aecf-20b41b3287b4","Type":"ContainerStarted","Data":"636efd01de4d3323befef167f6d75ea8d5faaaa8707bbaaff82ca7a8bd183d68"} Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.538873 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w4665" podStartSLOduration=2.538848506 podStartE2EDuration="2.538848506s" podCreationTimestamp="2025-09-30 03:12:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:12:37.534900774 +0000 UTC m=+1084.708120748" watchObservedRunningTime="2025-09-30 03:12:37.538848506 +0000 UTC m=+1084.712068470" Sep 30 03:12:37 crc kubenswrapper[4744]: I0930 03:12:37.859333 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758dd45f85-fgxnt"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.037477 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f44fcbcd7-cpnxd"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.053924 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f44fcbcd7-cpnxd"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.054033 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.167859 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da7b1846-158a-430c-8c16-f9db6bfcaaf7-logs\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.167899 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcw5h\" (UniqueName: \"kubernetes.io/projected/da7b1846-158a-430c-8c16-f9db6bfcaaf7-kube-api-access-kcw5h\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.167955 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-scripts\") pod 
\"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.168046 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da7b1846-158a-430c-8c16-f9db6bfcaaf7-horizon-secret-key\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.168065 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-config-data\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.247552 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.269963 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-config-data\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.270026 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw5h\" (UniqueName: \"kubernetes.io/projected/da7b1846-158a-430c-8c16-f9db6bfcaaf7-kube-api-access-kcw5h\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.270047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/da7b1846-158a-430c-8c16-f9db6bfcaaf7-logs\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.270112 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-scripts\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.270257 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da7b1846-158a-430c-8c16-f9db6bfcaaf7-horizon-secret-key\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.270649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da7b1846-158a-430c-8c16-f9db6bfcaaf7-logs\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.271087 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-scripts\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.271184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-config-data\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " 
pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.279073 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da7b1846-158a-430c-8c16-f9db6bfcaaf7-horizon-secret-key\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.285899 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcw5h\" (UniqueName: \"kubernetes.io/projected/da7b1846-158a-430c-8c16-f9db6bfcaaf7-kube-api-access-kcw5h\") pod \"horizon-6f44fcbcd7-cpnxd\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.341653 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.418148 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8h7n\" (UniqueName: \"kubernetes.io/projected/38f32dab-478a-48a5-bdce-066bf22d4367-kube-api-access-p8h7n\") pod \"38f32dab-478a-48a5-bdce-066bf22d4367\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.418196 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-swift-storage-0\") pod \"38f32dab-478a-48a5-bdce-066bf22d4367\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.418251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-sb\") pod 
\"38f32dab-478a-48a5-bdce-066bf22d4367\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.418350 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-nb\") pod \"38f32dab-478a-48a5-bdce-066bf22d4367\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.418397 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-config\") pod \"38f32dab-478a-48a5-bdce-066bf22d4367\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.418443 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-svc\") pod \"38f32dab-478a-48a5-bdce-066bf22d4367\" (UID: \"38f32dab-478a-48a5-bdce-066bf22d4367\") " Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.420597 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.434550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f32dab-478a-48a5-bdce-066bf22d4367-kube-api-access-p8h7n" (OuterVolumeSpecName: "kube-api-access-p8h7n") pod "38f32dab-478a-48a5-bdce-066bf22d4367" (UID: "38f32dab-478a-48a5-bdce-066bf22d4367"). InnerVolumeSpecName "kube-api-access-p8h7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.450215 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38f32dab-478a-48a5-bdce-066bf22d4367" (UID: "38f32dab-478a-48a5-bdce-066bf22d4367"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.458976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38f32dab-478a-48a5-bdce-066bf22d4367" (UID: "38f32dab-478a-48a5-bdce-066bf22d4367"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.468030 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-75phb"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.474474 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bvn26"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.484202 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5rqpd"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.488223 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38f32dab-478a-48a5-bdce-066bf22d4367" (UID: "38f32dab-478a-48a5-bdce-066bf22d4367"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.490344 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-config" (OuterVolumeSpecName: "config") pod "38f32dab-478a-48a5-bdce-066bf22d4367" (UID: "38f32dab-478a-48a5-bdce-066bf22d4367"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:38 crc kubenswrapper[4744]: W0930 03:12:38.492063 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb118b5fa_982e_4bd6_a6dc_5d2015b3b399.slice/crio-e5ef8cec6dbfaec5e189072ecb59225c33fd0f6dca07d644897408b1481a0338 WatchSource:0}: Error finding container e5ef8cec6dbfaec5e189072ecb59225c33fd0f6dca07d644897408b1481a0338: Status 404 returned error can't find the container with id e5ef8cec6dbfaec5e189072ecb59225c33fd0f6dca07d644897408b1481a0338 Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.499486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38f32dab-478a-48a5-bdce-066bf22d4367" (UID: "38f32dab-478a-48a5-bdce-066bf22d4367"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.523811 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8h7n\" (UniqueName: \"kubernetes.io/projected/38f32dab-478a-48a5-bdce-066bf22d4367-kube-api-access-p8h7n\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.523842 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.523851 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.523860 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.523869 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.523877 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f32dab-478a-48a5-bdce-066bf22d4367-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.541880 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5rqpd" event={"ID":"b10ecb07-4d75-4842-a753-f76c3a1d3b62","Type":"ContainerStarted","Data":"2056bb4fa7439873b7ed2093f332c425fdbec7d52bbd7a2be5bba0a4c3ebd049"} Sep 30 03:12:38 crc 
kubenswrapper[4744]: I0930 03:12:38.551860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" event={"ID":"38f32dab-478a-48a5-bdce-066bf22d4367","Type":"ContainerDied","Data":"3c1ebf4fe252aa9eb69a88be808e598c57858d55b822147232ca632877cde331"} Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.551913 4744 scope.go:117] "RemoveContainer" containerID="8bd923d140def36bc0bbf9dbca916897b794922b3a8ddfa38c51a1205a82c011" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.551916 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-wq8th" Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.556815 4744 generic.go:334] "Generic (PLEG): container finished" podID="982a253c-4987-43f1-896c-1ce2fa503826" containerID="caf53f62d474486143d892735af2330b6b693b527cee2c72d14e97c7a8fa37f3" exitCode=0 Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.556874 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" event={"ID":"982a253c-4987-43f1-896c-1ce2fa503826","Type":"ContainerDied","Data":"caf53f62d474486143d892735af2330b6b693b527cee2c72d14e97c7a8fa37f3"} Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.566317 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-75phb" event={"ID":"b118b5fa-982e-4bd6-a6dc-5d2015b3b399","Type":"ContainerStarted","Data":"e5ef8cec6dbfaec5e189072ecb59225c33fd0f6dca07d644897408b1481a0338"} Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.567942 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bvn26" event={"ID":"72b19763-eb29-45ca-9431-8791543dee83","Type":"ContainerStarted","Data":"e6fdfecf1be38a65cd4b39854056f86579d49e65ba305543a1df6ba88e907bb7"} Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.570167 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="dab1acfa-0313-4621-9d6e-6ab34807d0e5" containerID="736e67b520621626ade8ae88a72df0e7390e168b4daadd51e756193ff9493e52" exitCode=0 Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.570457 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rzbkf" event={"ID":"dab1acfa-0313-4621-9d6e-6ab34807d0e5","Type":"ContainerDied","Data":"736e67b520621626ade8ae88a72df0e7390e168b4daadd51e756193ff9493e52"} Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.575007 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-pwqjw"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.666527 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wq8th"] Sep 30 03:12:38 crc kubenswrapper[4744]: I0930 03:12:38.672417 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-wq8th"] Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.033313 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f44fcbcd7-cpnxd"] Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.515155 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f32dab-478a-48a5-bdce-066bf22d4367" path="/var/lib/kubelet/pods/38f32dab-478a-48a5-bdce-066bf22d4367/volumes" Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.590491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" event={"ID":"982a253c-4987-43f1-896c-1ce2fa503826","Type":"ContainerStarted","Data":"04888bbc1ac4e2b17a972fcd005ed77ada3a0ef86faa7ce6f737064368a93eb8"} Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.590639 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.591955 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pwqjw" 
event={"ID":"a24c42a2-4afa-4c32-ba87-18251fd1345a","Type":"ContainerStarted","Data":"1518c5f2cdb2a35b74e22c57f80274f509823a96a590622875d658645b7aeed9"} Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.599026 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f44fcbcd7-cpnxd" event={"ID":"da7b1846-158a-430c-8c16-f9db6bfcaaf7","Type":"ContainerStarted","Data":"1a44e8ab299c7c2d4325ebbb1dadb1b82777b72b461b9e5eb4f9ad2749fb9506"} Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.602532 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5rqpd" event={"ID":"b10ecb07-4d75-4842-a753-f76c3a1d3b62","Type":"ContainerStarted","Data":"57edffd5dfc895f964c798f7ed818242474b6c5b30da74d79cc484ce03eb0c81"} Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.618315 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" podStartSLOduration=3.6182949129999997 podStartE2EDuration="3.618294913s" podCreationTimestamp="2025-09-30 03:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:12:39.6149699 +0000 UTC m=+1086.788189874" watchObservedRunningTime="2025-09-30 03:12:39.618294913 +0000 UTC m=+1086.791514887" Sep 30 03:12:39 crc kubenswrapper[4744]: I0930 03:12:39.637521 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-5rqpd" podStartSLOduration=3.637505959 podStartE2EDuration="3.637505959s" podCreationTimestamp="2025-09-30 03:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:12:39.631647567 +0000 UTC m=+1086.804867541" watchObservedRunningTime="2025-09-30 03:12:39.637505959 +0000 UTC m=+1086.810725933" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.022421 4744 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.163710 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqcdr\" (UniqueName: \"kubernetes.io/projected/dab1acfa-0313-4621-9d6e-6ab34807d0e5-kube-api-access-kqcdr\") pod \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.163796 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-db-sync-config-data\") pod \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.163862 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-config-data\") pod \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.164040 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-combined-ca-bundle\") pod \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\" (UID: \"dab1acfa-0313-4621-9d6e-6ab34807d0e5\") " Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.169633 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab1acfa-0313-4621-9d6e-6ab34807d0e5-kube-api-access-kqcdr" (OuterVolumeSpecName: "kube-api-access-kqcdr") pod "dab1acfa-0313-4621-9d6e-6ab34807d0e5" (UID: "dab1acfa-0313-4621-9d6e-6ab34807d0e5"). InnerVolumeSpecName "kube-api-access-kqcdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.170264 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dab1acfa-0313-4621-9d6e-6ab34807d0e5" (UID: "dab1acfa-0313-4621-9d6e-6ab34807d0e5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.192593 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dab1acfa-0313-4621-9d6e-6ab34807d0e5" (UID: "dab1acfa-0313-4621-9d6e-6ab34807d0e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.211908 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-config-data" (OuterVolumeSpecName: "config-data") pod "dab1acfa-0313-4621-9d6e-6ab34807d0e5" (UID: "dab1acfa-0313-4621-9d6e-6ab34807d0e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.266580 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqcdr\" (UniqueName: \"kubernetes.io/projected/dab1acfa-0313-4621-9d6e-6ab34807d0e5-kube-api-access-kqcdr\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.266644 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.266655 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.266690 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab1acfa-0313-4621-9d6e-6ab34807d0e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:40 crc kubenswrapper[4744]: E0930 03:12:40.376691 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69fa47c0_aec4_4cf8_9882_1e856ebfaaf8.slice/crio-3a22f35c6d1fc6b15165df25b9cf2424bf9c5f9423ecc4446132535757f7b8f9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69fa47c0_aec4_4cf8_9882_1e856ebfaaf8.slice/crio-conmon-3a22f35c6d1fc6b15165df25b9cf2424bf9c5f9423ecc4446132535757f7b8f9.scope\": RecentStats: unable to find data in memory cache]" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.611174 4744 generic.go:334] "Generic (PLEG): container finished" podID="69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" 
containerID="3a22f35c6d1fc6b15165df25b9cf2424bf9c5f9423ecc4446132535757f7b8f9" exitCode=0 Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.611260 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4665" event={"ID":"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8","Type":"ContainerDied","Data":"3a22f35c6d1fc6b15165df25b9cf2424bf9c5f9423ecc4446132535757f7b8f9"} Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.613227 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rzbkf" event={"ID":"dab1acfa-0313-4621-9d6e-6ab34807d0e5","Type":"ContainerDied","Data":"122c23c906878a2495de0117a8648ecaa4014fa78803ee086b12152e1c05fc87"} Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.613253 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="122c23c906878a2495de0117a8648ecaa4014fa78803ee086b12152e1c05fc87" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.613307 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rzbkf" Sep 30 03:12:40 crc kubenswrapper[4744]: I0930 03:12:40.991107 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qhm9q"] Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.047024 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-6htj2"] Sep 30 03:12:41 crc kubenswrapper[4744]: E0930 03:12:41.047384 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f32dab-478a-48a5-bdce-066bf22d4367" containerName="init" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.047399 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f32dab-478a-48a5-bdce-066bf22d4367" containerName="init" Sep 30 03:12:41 crc kubenswrapper[4744]: E0930 03:12:41.047434 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab1acfa-0313-4621-9d6e-6ab34807d0e5" containerName="glance-db-sync" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.047439 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab1acfa-0313-4621-9d6e-6ab34807d0e5" containerName="glance-db-sync" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.047599 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab1acfa-0313-4621-9d6e-6ab34807d0e5" containerName="glance-db-sync" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.047619 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f32dab-478a-48a5-bdce-066bf22d4367" containerName="init" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.048447 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.070585 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-6htj2"] Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.187620 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-config\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.187685 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.187708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.187952 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.188001 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vtvjw\" (UniqueName: \"kubernetes.io/projected/dfa52756-287f-4dad-941c-c4d0c44d93f9-kube-api-access-vtvjw\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.188024 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.289052 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.289092 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvjw\" (UniqueName: \"kubernetes.io/projected/dfa52756-287f-4dad-941c-c4d0c44d93f9-kube-api-access-vtvjw\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.289182 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-config\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.289220 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.289234 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.289279 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.290024 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.290089 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.290171 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-swift-storage-0\") 
pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.290264 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-config\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.290357 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.306745 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvjw\" (UniqueName: \"kubernetes.io/projected/dfa52756-287f-4dad-941c-c4d0c44d93f9-kube-api-access-vtvjw\") pod \"dnsmasq-dns-8b5c85b87-6htj2\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.377058 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:12:41 crc kubenswrapper[4744]: I0930 03:12:41.625145 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="dnsmasq-dns" containerID="cri-o://04888bbc1ac4e2b17a972fcd005ed77ada3a0ef86faa7ce6f737064368a93eb8" gracePeriod=10 Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.022400 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.026093 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.030097 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.030097 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.030533 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.030632 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rt5lh" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.035293 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.123605 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.127763 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.131455 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.154571 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-ceph\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213605 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-logs\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-config-data\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213662 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-scripts\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 
03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213708 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213751 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213775 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.213791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqdn\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-kube-api-access-vmqdn\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.314790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-scripts\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 
03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.314870 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.314906 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.314933 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpz97\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-kube-api-access-xpz97\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.314954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.314978 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 
30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315003 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315023 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqdn\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-kube-api-access-vmqdn\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315050 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315067 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-ceph\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 
03:12:42.315144 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315167 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315205 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-ceph\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315226 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-logs\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.315249 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-config-data\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.316001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-logs\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.316051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.318745 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.332337 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-scripts\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.333081 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-ceph\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.333270 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.333326 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-config-data\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.336617 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqdn\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-kube-api-access-vmqdn\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.355109 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.416830 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.416880 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-ceph\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.416910 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.416937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.417065 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.417161 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.417184 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpz97\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-kube-api-access-xpz97\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc 
kubenswrapper[4744]: I0930 03:12:42.417208 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.417336 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.417442 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.422116 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-config-data\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.424972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-logs\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.425882 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.428493 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-scripts\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.434149 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-ceph\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.438713 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpz97\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-kube-api-access-xpz97\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.446188 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.637491 4744 generic.go:334] "Generic (PLEG): container finished" podID="982a253c-4987-43f1-896c-1ce2fa503826" containerID="04888bbc1ac4e2b17a972fcd005ed77ada3a0ef86faa7ce6f737064368a93eb8" exitCode=0 Sep 30 
03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.637544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" event={"ID":"982a253c-4987-43f1-896c-1ce2fa503826","Type":"ContainerDied","Data":"04888bbc1ac4e2b17a972fcd005ed77ada3a0ef86faa7ce6f737064368a93eb8"} Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.648019 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:12:42 crc kubenswrapper[4744]: I0930 03:12:42.748019 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.497174 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.637540 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-config-data\") pod \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.637684 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-scripts\") pod \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.637721 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-combined-ca-bundle\") pod \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.637831 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wphxf\" (UniqueName: \"kubernetes.io/projected/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-kube-api-access-wphxf\") pod \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.637907 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-fernet-keys\") pod \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.637928 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-credential-keys\") pod \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\" (UID: \"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8\") " Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.648684 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-scripts" (OuterVolumeSpecName: "scripts") pod "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" (UID: "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.648718 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" (UID: "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.648773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-kube-api-access-wphxf" (OuterVolumeSpecName: "kube-api-access-wphxf") pod "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" (UID: "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8"). InnerVolumeSpecName "kube-api-access-wphxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.653506 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" (UID: "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.676605 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4665" event={"ID":"69fa47c0-aec4-4cf8-9882-1e856ebfaaf8","Type":"ContainerDied","Data":"7722736858fbf776b849db84dd42301838b090460cebd427647c8779359dd2e0"} Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.677733 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7722736858fbf776b849db84dd42301838b090460cebd427647c8779359dd2e0" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.677652 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-w4665" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.679389 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" (UID: "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.726035 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-config-data" (OuterVolumeSpecName: "config-data") pod "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" (UID: "69fa47c0-aec4-4cf8-9882-1e856ebfaaf8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.740405 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.740437 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.740449 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wphxf\" (UniqueName: \"kubernetes.io/projected/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-kube-api-access-wphxf\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.740461 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-fernet-keys\") on node \"crc\" DevicePath 
\"\"" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.740471 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:43 crc kubenswrapper[4744]: I0930 03:12:43.740480 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.583717 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w4665"] Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.589845 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w4665"] Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.681780 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nm8l8"] Sep 30 03:12:44 crc kubenswrapper[4744]: E0930 03:12:44.682493 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" containerName="keystone-bootstrap" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.682516 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" containerName="keystone-bootstrap" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.693136 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" containerName="keystone-bootstrap" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.693881 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.698273 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nm8l8"] Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.709134 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.709166 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.709559 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.709980 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g28pb" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.761877 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-combined-ca-bundle\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.761952 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-scripts\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.762001 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-fernet-keys\") pod \"keystone-bootstrap-nm8l8\" (UID: 
\"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.762026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-config-data\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.762090 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxbg\" (UniqueName: \"kubernetes.io/projected/08504742-967f-491a-a3ab-9ddcadb556c4-kube-api-access-xwxbg\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.762162 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-credential-keys\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.864256 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-config-data\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.864429 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxbg\" (UniqueName: \"kubernetes.io/projected/08504742-967f-491a-a3ab-9ddcadb556c4-kube-api-access-xwxbg\") pod \"keystone-bootstrap-nm8l8\" (UID: 
\"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.864520 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-credential-keys\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.864548 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-combined-ca-bundle\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.864585 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-scripts\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.864627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-fernet-keys\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.870882 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-combined-ca-bundle\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc 
kubenswrapper[4744]: I0930 03:12:44.881647 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-config-data\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.881761 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-scripts\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.883032 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-credential-keys\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.884722 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-fernet-keys\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:44 crc kubenswrapper[4744]: I0930 03:12:44.886930 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxbg\" (UniqueName: \"kubernetes.io/projected/08504742-967f-491a-a3ab-9ddcadb556c4-kube-api-access-xwxbg\") pod \"keystone-bootstrap-nm8l8\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:45 crc kubenswrapper[4744]: I0930 03:12:45.036070 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:12:45 crc kubenswrapper[4744]: I0930 03:12:45.515444 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fa47c0-aec4-4cf8-9882-1e856ebfaaf8" path="/var/lib/kubelet/pods/69fa47c0-aec4-4cf8-9882-1e856ebfaaf8/volumes" Sep 30 03:12:46 crc kubenswrapper[4744]: I0930 03:12:46.926443 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:12:46 crc kubenswrapper[4744]: I0930 03:12:46.999890 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.379888 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-796557ff95-kphjm"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.414547 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-787b588c76-v5mnn"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.416257 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.420031 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.432163 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-787b588c76-v5mnn"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.469878 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f44fcbcd7-cpnxd"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.492394 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78db449746-kg7zl"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.493767 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.505112 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78db449746-kg7zl"] Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.570800 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-combined-ca-bundle\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.570849 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-secret-key\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.570892 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f214ceb-c91a-4672-8711-9728a3f5e3f3-logs\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.570959 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lr98\" (UniqueName: \"kubernetes.io/projected/1f214ceb-c91a-4672-8711-9728a3f5e3f3-kube-api-access-6lr98\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.570989 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-scripts\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.571007 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-config-data\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.571038 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-tls-certs\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673548 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-combined-ca-bundle\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673616 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff31735f-472e-4b3a-8d81-bc5c392aec09-config-data\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673642 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-horizon-tls-certs\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673662 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-secret-key\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673704 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f214ceb-c91a-4672-8711-9728a3f5e3f3-logs\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lr98\" (UniqueName: \"kubernetes.io/projected/1f214ceb-c91a-4672-8711-9728a3f5e3f3-kube-api-access-6lr98\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673768 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff31735f-472e-4b3a-8d81-bc5c392aec09-scripts\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673795 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-scripts\") pod 
\"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673815 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-config-data\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673867 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-tls-certs\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673900 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-horizon-secret-key\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fslzm\" (UniqueName: \"kubernetes.io/projected/ff31735f-472e-4b3a-8d81-bc5c392aec09-kube-api-access-fslzm\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673937 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff31735f-472e-4b3a-8d81-bc5c392aec09-logs\") pod \"horizon-78db449746-kg7zl\" (UID: 
\"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.673978 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-combined-ca-bundle\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.675242 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f214ceb-c91a-4672-8711-9728a3f5e3f3-logs\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.676094 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-config-data\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.676688 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-scripts\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.679225 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-tls-certs\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 
03:12:48.681787 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-secret-key\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.690290 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lr98\" (UniqueName: \"kubernetes.io/projected/1f214ceb-c91a-4672-8711-9728a3f5e3f3-kube-api-access-6lr98\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.697034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-combined-ca-bundle\") pod \"horizon-787b588c76-v5mnn\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.743362 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.796764 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-horizon-tls-certs\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.798057 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff31735f-472e-4b3a-8d81-bc5c392aec09-scripts\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.798301 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-horizon-secret-key\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.798405 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fslzm\" (UniqueName: \"kubernetes.io/projected/ff31735f-472e-4b3a-8d81-bc5c392aec09-kube-api-access-fslzm\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.798529 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff31735f-472e-4b3a-8d81-bc5c392aec09-logs\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: 
I0930 03:12:48.798625 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-combined-ca-bundle\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.799049 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff31735f-472e-4b3a-8d81-bc5c392aec09-config-data\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.800186 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff31735f-472e-4b3a-8d81-bc5c392aec09-logs\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.800658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff31735f-472e-4b3a-8d81-bc5c392aec09-scripts\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.801067 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff31735f-472e-4b3a-8d81-bc5c392aec09-config-data\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.802658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-horizon-tls-certs\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.803983 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-combined-ca-bundle\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.815971 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fslzm\" (UniqueName: \"kubernetes.io/projected/ff31735f-472e-4b3a-8d81-bc5c392aec09-kube-api-access-fslzm\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:48 crc kubenswrapper[4744]: I0930 03:12:48.818180 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff31735f-472e-4b3a-8d81-bc5c392aec09-horizon-secret-key\") pod \"horizon-78db449746-kg7zl\" (UID: \"ff31735f-472e-4b3a-8d81-bc5c392aec09\") " pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.049425 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.114941 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.204785 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-config\") pod \"982a253c-4987-43f1-896c-1ce2fa503826\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.204904 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-nb\") pod \"982a253c-4987-43f1-896c-1ce2fa503826\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.205000 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-swift-storage-0\") pod \"982a253c-4987-43f1-896c-1ce2fa503826\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.205065 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-sb\") pod \"982a253c-4987-43f1-896c-1ce2fa503826\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.205090 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7rhh\" (UniqueName: \"kubernetes.io/projected/982a253c-4987-43f1-896c-1ce2fa503826-kube-api-access-p7rhh\") pod \"982a253c-4987-43f1-896c-1ce2fa503826\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.205142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-svc\") pod \"982a253c-4987-43f1-896c-1ce2fa503826\" (UID: \"982a253c-4987-43f1-896c-1ce2fa503826\") " Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.209301 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982a253c-4987-43f1-896c-1ce2fa503826-kube-api-access-p7rhh" (OuterVolumeSpecName: "kube-api-access-p7rhh") pod "982a253c-4987-43f1-896c-1ce2fa503826" (UID: "982a253c-4987-43f1-896c-1ce2fa503826"). InnerVolumeSpecName "kube-api-access-p7rhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.258580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "982a253c-4987-43f1-896c-1ce2fa503826" (UID: "982a253c-4987-43f1-896c-1ce2fa503826"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.259014 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-config" (OuterVolumeSpecName: "config") pod "982a253c-4987-43f1-896c-1ce2fa503826" (UID: "982a253c-4987-43f1-896c-1ce2fa503826"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.260408 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "982a253c-4987-43f1-896c-1ce2fa503826" (UID: "982a253c-4987-43f1-896c-1ce2fa503826"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.267072 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "982a253c-4987-43f1-896c-1ce2fa503826" (UID: "982a253c-4987-43f1-896c-1ce2fa503826"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.273313 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "982a253c-4987-43f1-896c-1ce2fa503826" (UID: "982a253c-4987-43f1-896c-1ce2fa503826"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.307718 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.307749 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.307760 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.307769 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7rhh\" (UniqueName: \"kubernetes.io/projected/982a253c-4987-43f1-896c-1ce2fa503826-kube-api-access-p7rhh\") on 
node \"crc\" DevicePath \"\"" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.307780 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.307788 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/982a253c-4987-43f1-896c-1ce2fa503826-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.770854 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" event={"ID":"982a253c-4987-43f1-896c-1ce2fa503826","Type":"ContainerDied","Data":"81679733c8f4330d18307a74bc460f6f8858d568df6a938e3efb4141e1271c41"} Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.770934 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.771172 4744 scope.go:117] "RemoveContainer" containerID="04888bbc1ac4e2b17a972fcd005ed77ada3a0ef86faa7ce6f737064368a93eb8" Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.800143 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qhm9q"] Sep 30 03:12:49 crc kubenswrapper[4744]: I0930 03:12:49.806610 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qhm9q"] Sep 30 03:12:51 crc kubenswrapper[4744]: I0930 03:12:51.517268 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982a253c-4987-43f1-896c-1ce2fa503826" path="/var/lib/kubelet/pods/982a253c-4987-43f1-896c-1ce2fa503826/volumes" Sep 30 03:12:51 crc kubenswrapper[4744]: I0930 03:12:51.635483 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-qhm9q" 
podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout" Sep 30 03:12:54 crc kubenswrapper[4744]: E0930 03:12:54.049860 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 03:12:54 crc kubenswrapper[4744]: E0930 03:12:54.050462 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55dhddh79h55h5c4hc9h5fdh5c8hf7hfdh9dh65bh55dh99hfdh89h597h64fh646h56h599h6h8bh77h65ch59h56ch65bhb9h6dh9ch5d5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcw5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNO
D],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f44fcbcd7-cpnxd_openstack(da7b1846-158a-430c-8c16-f9db6bfcaaf7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:12:54 crc kubenswrapper[4744]: E0930 03:12:54.052255 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f44fcbcd7-cpnxd" podUID="da7b1846-158a-430c-8c16-f9db6bfcaaf7" Sep 30 03:12:55 crc kubenswrapper[4744]: I0930 03:12:55.823400 4744 generic.go:334] "Generic (PLEG): container finished" podID="b10ecb07-4d75-4842-a753-f76c3a1d3b62" containerID="57edffd5dfc895f964c798f7ed818242474b6c5b30da74d79cc484ce03eb0c81" exitCode=0 Sep 30 03:12:55 crc kubenswrapper[4744]: I0930 03:12:55.823684 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5rqpd" event={"ID":"b10ecb07-4d75-4842-a753-f76c3a1d3b62","Type":"ContainerDied","Data":"57edffd5dfc895f964c798f7ed818242474b6c5b30da74d79cc484ce03eb0c81"} Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 03:13:04.118969 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Sep 30 03:13:04 
crc kubenswrapper[4744]: E0930 03:13:04.119469 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bnrg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-75phb_openstack(b118b5fa-982e-4bd6-a6dc-5d2015b3b399): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 
03:13:04.120632 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-75phb" podUID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.347792 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.347883 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 03:13:04.568793 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 03:13:04.569234 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db 
sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4skk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-pwqjw_openstack(a24c42a2-4afa-4c32-ba87-18251fd1345a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 03:13:04.570602 4744 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-pwqjw" podUID="a24c42a2-4afa-4c32-ba87-18251fd1345a" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.648052 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.832285 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-config-data\") pod \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.832454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcw5h\" (UniqueName: \"kubernetes.io/projected/da7b1846-158a-430c-8c16-f9db6bfcaaf7-kube-api-access-kcw5h\") pod \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.832510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da7b1846-158a-430c-8c16-f9db6bfcaaf7-horizon-secret-key\") pod \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.832641 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da7b1846-158a-430c-8c16-f9db6bfcaaf7-logs\") pod \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.832730 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-scripts\") pod \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\" (UID: \"da7b1846-158a-430c-8c16-f9db6bfcaaf7\") " Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.834540 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-scripts" (OuterVolumeSpecName: "scripts") pod "da7b1846-158a-430c-8c16-f9db6bfcaaf7" (UID: "da7b1846-158a-430c-8c16-f9db6bfcaaf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.835687 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-config-data" (OuterVolumeSpecName: "config-data") pod "da7b1846-158a-430c-8c16-f9db6bfcaaf7" (UID: "da7b1846-158a-430c-8c16-f9db6bfcaaf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.837844 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da7b1846-158a-430c-8c16-f9db6bfcaaf7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "da7b1846-158a-430c-8c16-f9db6bfcaaf7" (UID: "da7b1846-158a-430c-8c16-f9db6bfcaaf7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.840088 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7b1846-158a-430c-8c16-f9db6bfcaaf7-kube-api-access-kcw5h" (OuterVolumeSpecName: "kube-api-access-kcw5h") pod "da7b1846-158a-430c-8c16-f9db6bfcaaf7" (UID: "da7b1846-158a-430c-8c16-f9db6bfcaaf7"). InnerVolumeSpecName "kube-api-access-kcw5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.840266 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da7b1846-158a-430c-8c16-f9db6bfcaaf7-logs" (OuterVolumeSpecName: "logs") pod "da7b1846-158a-430c-8c16-f9db6bfcaaf7" (UID: "da7b1846-158a-430c-8c16-f9db6bfcaaf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.911777 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f44fcbcd7-cpnxd" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.914280 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f44fcbcd7-cpnxd" event={"ID":"da7b1846-158a-430c-8c16-f9db6bfcaaf7","Type":"ContainerDied","Data":"1a44e8ab299c7c2d4325ebbb1dadb1b82777b72b461b9e5eb4f9ad2749fb9506"} Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 03:13:04.916913 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-pwqjw" podUID="a24c42a2-4afa-4c32-ba87-18251fd1345a" Sep 30 03:13:04 crc kubenswrapper[4744]: E0930 03:13:04.917112 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-75phb" podUID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.934767 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-scripts\") on node \"crc\" DevicePath 
\"\"" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.934792 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da7b1846-158a-430c-8c16-f9db6bfcaaf7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.934801 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcw5h\" (UniqueName: \"kubernetes.io/projected/da7b1846-158a-430c-8c16-f9db6bfcaaf7-kube-api-access-kcw5h\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.934811 4744 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da7b1846-158a-430c-8c16-f9db6bfcaaf7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:04 crc kubenswrapper[4744]: I0930 03:13:04.934819 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da7b1846-158a-430c-8c16-f9db6bfcaaf7-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.004446 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f44fcbcd7-cpnxd"] Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.011175 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f44fcbcd7-cpnxd"] Sep 30 03:13:05 crc kubenswrapper[4744]: E0930 03:13:05.073110 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Sep 30 03:13:05 crc kubenswrapper[4744]: E0930 03:13:05.073325 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7dh57bhc4h6ch77h5b9h5b5h5d9h5b7h5f8h5cfhfbh5c9h5d4h58fhd8h679h557h8h8ch568h56fh5ch58dh7dhc9h66fh55h7ch595hbh549q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kj846,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(c56c5a65-d4fe-4772-ba30-eae95674c422): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.113668 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.141311 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwp4m\" (UniqueName: \"kubernetes.io/projected/b10ecb07-4d75-4842-a753-f76c3a1d3b62-kube-api-access-vwp4m\") pod \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.142462 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-config\") pod \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.142497 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-combined-ca-bundle\") pod \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\" (UID: \"b10ecb07-4d75-4842-a753-f76c3a1d3b62\") " Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.146927 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10ecb07-4d75-4842-a753-f76c3a1d3b62-kube-api-access-vwp4m" (OuterVolumeSpecName: "kube-api-access-vwp4m") pod "b10ecb07-4d75-4842-a753-f76c3a1d3b62" (UID: "b10ecb07-4d75-4842-a753-f76c3a1d3b62"). InnerVolumeSpecName "kube-api-access-vwp4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.167960 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b10ecb07-4d75-4842-a753-f76c3a1d3b62" (UID: "b10ecb07-4d75-4842-a753-f76c3a1d3b62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.173502 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-config" (OuterVolumeSpecName: "config") pod "b10ecb07-4d75-4842-a753-f76c3a1d3b62" (UID: "b10ecb07-4d75-4842-a753-f76c3a1d3b62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.245018 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwp4m\" (UniqueName: \"kubernetes.io/projected/b10ecb07-4d75-4842-a753-f76c3a1d3b62-kube-api-access-vwp4m\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.245410 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.245432 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10ecb07-4d75-4842-a753-f76c3a1d3b62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.522649 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7b1846-158a-430c-8c16-f9db6bfcaaf7" path="/var/lib/kubelet/pods/da7b1846-158a-430c-8c16-f9db6bfcaaf7/volumes" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.922873 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5rqpd" event={"ID":"b10ecb07-4d75-4842-a753-f76c3a1d3b62","Type":"ContainerDied","Data":"2056bb4fa7439873b7ed2093f332c425fdbec7d52bbd7a2be5bba0a4c3ebd049"} Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.922915 4744 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2056bb4fa7439873b7ed2093f332c425fdbec7d52bbd7a2be5bba0a4c3ebd049" Sep 30 03:13:05 crc kubenswrapper[4744]: I0930 03:13:05.922945 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5rqpd" Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.083125 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.083304 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db
-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2fcj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bvn26_openstack(72b19763-eb29-45ca-9431-8791543dee83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.084494 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-bvn26" podUID="72b19763-eb29-45ca-9431-8791543dee83" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.096740 4744 scope.go:117] "RemoveContainer" containerID="caf53f62d474486143d892735af2330b6b693b527cee2c72d14e97c7a8fa37f3" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.308092 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-6htj2"] Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.341804 4744 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dd595ddb6-2wvzb"] Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.342177 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="init" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.342188 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="init" Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.342210 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="dnsmasq-dns" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.342216 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="dnsmasq-dns" Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.342232 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b10ecb07-4d75-4842-a753-f76c3a1d3b62" containerName="neutron-db-sync" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.342239 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10ecb07-4d75-4842-a753-f76c3a1d3b62" containerName="neutron-db-sync" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.342412 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b10ecb07-4d75-4842-a753-f76c3a1d3b62" containerName="neutron-db-sync" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.342427 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="982a253c-4987-43f1-896c-1ce2fa503826" containerName="dnsmasq-dns" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.343247 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.352337 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.352534 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vvvwl" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.352633 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.352730 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.357502 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-jc7k5"] Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.358899 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369160 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-httpd-config\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369410 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369535 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-ovndb-tls-certs\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369730 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s5bm\" (UniqueName: \"kubernetes.io/projected/6ca18f47-7a09-4040-89f2-0b8c3f77a032-kube-api-access-4s5bm\") pod 
\"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.369947 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-config\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.370044 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.370152 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-combined-ca-bundle\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.370254 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-config\") pod 
\"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.370409 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2k7\" (UniqueName: \"kubernetes.io/projected/d8d97540-160a-4b25-9a0a-7ee3c27775f3-kube-api-access-jd2k7\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.388693 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dd595ddb6-2wvzb"] Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.415602 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-jc7k5"] Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-combined-ca-bundle\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472589 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-config\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472645 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2k7\" (UniqueName: \"kubernetes.io/projected/d8d97540-160a-4b25-9a0a-7ee3c27775f3-kube-api-access-jd2k7\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " 
pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472681 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-httpd-config\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472724 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472744 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-ovndb-tls-certs\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472781 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s5bm\" (UniqueName: \"kubernetes.io/projected/6ca18f47-7a09-4040-89f2-0b8c3f77a032-kube-api-access-4s5bm\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc 
kubenswrapper[4744]: I0930 03:13:06.472796 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472820 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-config\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.472840 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.475294 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.476013 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.476352 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.477799 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.483588 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-ovndb-tls-certs\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.486240 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-config\") pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.499544 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-config\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.506656 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s5bm\" (UniqueName: \"kubernetes.io/projected/6ca18f47-7a09-4040-89f2-0b8c3f77a032-kube-api-access-4s5bm\") 
pod \"dnsmasq-dns-84b966f6c9-jc7k5\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.516672 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-combined-ca-bundle\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.516770 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-httpd-config\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.519023 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2k7\" (UniqueName: \"kubernetes.io/projected/d8d97540-160a-4b25-9a0a-7ee3c27775f3-kube-api-access-jd2k7\") pod \"neutron-5dd595ddb6-2wvzb\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.524338 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.606305 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.746601 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-6htj2"] Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.952646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796557ff95-kphjm" event={"ID":"aa042742-a24d-4cf6-aecf-20b41b3287b4","Type":"ContainerStarted","Data":"a10058c3b49add8e4fe84a75fecff0f0ea29107b6cd728fe3c1119456e886646"} Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.959460 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" event={"ID":"dfa52756-287f-4dad-941c-c4d0c44d93f9","Type":"ContainerStarted","Data":"fcc3989aedfaeec7eb5c75f87d1e70c72361019b79e33ff2ea00824055540d33"} Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.962869 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dpxjq" event={"ID":"6d629c05-1300-4fb5-8f08-211a133fffe8","Type":"ContainerStarted","Data":"872fde3c707408ad072c82898450c2626de8273f8aab39941386a7ca03ae0278"} Sep 30 03:13:06 crc kubenswrapper[4744]: I0930 03:13:06.965501 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758dd45f85-fgxnt" event={"ID":"66d32c30-b69c-4637-97a9-c1112b954a92","Type":"ContainerStarted","Data":"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a"} Sep 30 03:13:06 crc kubenswrapper[4744]: E0930 03:13:06.966235 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-bvn26" podUID="72b19763-eb29-45ca-9431-8791543dee83" Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.006729 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-787b588c76-v5mnn"] Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.018148 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dpxjq" podStartSLOduration=3.592012058 podStartE2EDuration="31.018130058s" podCreationTimestamp="2025-09-30 03:12:36 +0000 UTC" firstStartedPulling="2025-09-30 03:12:37.122693835 +0000 UTC m=+1084.295913809" lastFinishedPulling="2025-09-30 03:13:04.548811835 +0000 UTC m=+1111.722031809" observedRunningTime="2025-09-30 03:13:06.989605563 +0000 UTC m=+1114.162825537" watchObservedRunningTime="2025-09-30 03:13:07.018130058 +0000 UTC m=+1114.191350032" Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.095967 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.170333 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78db449746-kg7zl"] Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.179595 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nm8l8"] Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.280687 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-jc7k5"] Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.367838 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dd595ddb6-2wvzb"] Sep 30 03:13:07 crc kubenswrapper[4744]: W0930 03:13:07.654697 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff31735f_472e_4b3a_8d81_bc5c392aec09.slice/crio-39834b7d299b74e6f0c956caec2a92365de9cdfccd350e7aca8d2bb8a3edbdcf WatchSource:0}: Error finding container 39834b7d299b74e6f0c956caec2a92365de9cdfccd350e7aca8d2bb8a3edbdcf: Status 404 returned error can't find the container with id 
39834b7d299b74e6f0c956caec2a92365de9cdfccd350e7aca8d2bb8a3edbdcf Sep 30 03:13:07 crc kubenswrapper[4744]: W0930 03:13:07.662567 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8d97540_160a_4b25_9a0a_7ee3c27775f3.slice/crio-1940feb553390b8685ecbc3028c6999d9a3771a96c9277a3c711e558664cb23e WatchSource:0}: Error finding container 1940feb553390b8685ecbc3028c6999d9a3771a96c9277a3c711e558664cb23e: Status 404 returned error can't find the container with id 1940feb553390b8685ecbc3028c6999d9a3771a96c9277a3c711e558664cb23e Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.977307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" event={"ID":"6ca18f47-7a09-4040-89f2-0b8c3f77a032","Type":"ContainerStarted","Data":"2bcd0da062e8b8d1eb14314c5be3c0c65998871700e8f938ef6a85b0c72ce88a"} Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.980277 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758dd45f85-fgxnt" event={"ID":"66d32c30-b69c-4637-97a9-c1112b954a92","Type":"ContainerStarted","Data":"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291"} Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.980441 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-758dd45f85-fgxnt" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon-log" containerID="cri-o://7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a" gracePeriod=30 Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.981851 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-758dd45f85-fgxnt" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon" containerID="cri-o://16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291" gracePeriod=30 Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 
03:13:07.992509 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70a293af-94d1-411a-b43e-2a6cdf13fd07","Type":"ContainerStarted","Data":"1745b4330ca4766c148b342d6a79c602c8839a6e9e904960e7c5a8c9458aba55"} Sep 30 03:13:07 crc kubenswrapper[4744]: I0930 03:13:07.995032 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nm8l8" event={"ID":"08504742-967f-491a-a3ab-9ddcadb556c4","Type":"ContainerStarted","Data":"25204fdfe511df50a37438f552d19d51e47c05d5b9dab9c1e972070080754bcc"} Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.007396 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd595ddb6-2wvzb" event={"ID":"d8d97540-160a-4b25-9a0a-7ee3c27775f3","Type":"ContainerStarted","Data":"1940feb553390b8685ecbc3028c6999d9a3771a96c9277a3c711e558664cb23e"} Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.008263 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-758dd45f85-fgxnt" podStartSLOduration=5.314285106 podStartE2EDuration="33.008248447s" podCreationTimestamp="2025-09-30 03:12:35 +0000 UTC" firstStartedPulling="2025-09-30 03:12:36.859594282 +0000 UTC m=+1084.032814256" lastFinishedPulling="2025-09-30 03:13:04.553557603 +0000 UTC m=+1111.726777597" observedRunningTime="2025-09-30 03:13:08.00770502 +0000 UTC m=+1115.180924994" watchObservedRunningTime="2025-09-30 03:13:08.008248447 +0000 UTC m=+1115.181468421" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.011846 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-787b588c76-v5mnn" event={"ID":"1f214ceb-c91a-4672-8711-9728a3f5e3f3","Type":"ContainerStarted","Data":"c6afb023f98893d5e3a79976ecc11f66ceb9a1dedcb89d27e61f309090ccfaac"} Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.016412 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78db449746-kg7zl" 
event={"ID":"ff31735f-472e-4b3a-8d81-bc5c392aec09","Type":"ContainerStarted","Data":"39834b7d299b74e6f0c956caec2a92365de9cdfccd350e7aca8d2bb8a3edbdcf"} Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.020974 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796557ff95-kphjm" event={"ID":"aa042742-a24d-4cf6-aecf-20b41b3287b4","Type":"ContainerStarted","Data":"eac00480fcc09aea56d5c80100cc25aebb5eafdb2a9df36c2149d681a3e4665a"} Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.021041 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-796557ff95-kphjm" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon-log" containerID="cri-o://a10058c3b49add8e4fe84a75fecff0f0ea29107b6cd728fe3c1119456e886646" gracePeriod=30 Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.021197 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-796557ff95-kphjm" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon" containerID="cri-o://eac00480fcc09aea56d5c80100cc25aebb5eafdb2a9df36c2149d681a3e4665a" gracePeriod=30 Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.035844 4744 generic.go:334] "Generic (PLEG): container finished" podID="dfa52756-287f-4dad-941c-c4d0c44d93f9" containerID="9c57a266a446a6d8d4ab35271916442848419ae0514e6fbf43b0ff2b96f49ad4" exitCode=0 Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.036767 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" event={"ID":"dfa52756-287f-4dad-941c-c4d0c44d93f9","Type":"ContainerDied","Data":"9c57a266a446a6d8d4ab35271916442848419ae0514e6fbf43b0ff2b96f49ad4"} Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.040783 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-796557ff95-kphjm" podStartSLOduration=3.810310995 podStartE2EDuration="33.040757696s" 
podCreationTimestamp="2025-09-30 03:12:35 +0000 UTC" firstStartedPulling="2025-09-30 03:12:36.916145817 +0000 UTC m=+1084.089365791" lastFinishedPulling="2025-09-30 03:13:06.146592518 +0000 UTC m=+1113.319812492" observedRunningTime="2025-09-30 03:13:08.036952188 +0000 UTC m=+1115.210172162" watchObservedRunningTime="2025-09-30 03:13:08.040757696 +0000 UTC m=+1115.213977670" Sep 30 03:13:08 crc kubenswrapper[4744]: W0930 03:13:08.093646 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173283f1_6dba_4c82_806f_0319e6ab1785.slice/crio-71cd0d346838a9b1b89375141df775cd925ca4489a3666c4169bb43435ce0935 WatchSource:0}: Error finding container 71cd0d346838a9b1b89375141df775cd925ca4489a3666c4169bb43435ce0935: Status 404 returned error can't find the container with id 71cd0d346838a9b1b89375141df775cd925ca4489a3666c4169bb43435ce0935 Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.098755 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.409790 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.530505 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-sb\") pod \"dfa52756-287f-4dad-941c-c4d0c44d93f9\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.531346 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-nb\") pod \"dfa52756-287f-4dad-941c-c4d0c44d93f9\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.531410 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-swift-storage-0\") pod \"dfa52756-287f-4dad-941c-c4d0c44d93f9\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.531445 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-config\") pod \"dfa52756-287f-4dad-941c-c4d0c44d93f9\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.531494 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtvjw\" (UniqueName: \"kubernetes.io/projected/dfa52756-287f-4dad-941c-c4d0c44d93f9-kube-api-access-vtvjw\") pod \"dfa52756-287f-4dad-941c-c4d0c44d93f9\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.531560 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-svc\") pod \"dfa52756-287f-4dad-941c-c4d0c44d93f9\" (UID: \"dfa52756-287f-4dad-941c-c4d0c44d93f9\") " Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.541512 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa52756-287f-4dad-941c-c4d0c44d93f9-kube-api-access-vtvjw" (OuterVolumeSpecName: "kube-api-access-vtvjw") pod "dfa52756-287f-4dad-941c-c4d0c44d93f9" (UID: "dfa52756-287f-4dad-941c-c4d0c44d93f9"). InnerVolumeSpecName "kube-api-access-vtvjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.555346 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dfa52756-287f-4dad-941c-c4d0c44d93f9" (UID: "dfa52756-287f-4dad-941c-c4d0c44d93f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.561720 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfa52756-287f-4dad-941c-c4d0c44d93f9" (UID: "dfa52756-287f-4dad-941c-c4d0c44d93f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.575981 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfa52756-287f-4dad-941c-c4d0c44d93f9" (UID: "dfa52756-287f-4dad-941c-c4d0c44d93f9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.602690 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-config" (OuterVolumeSpecName: "config") pod "dfa52756-287f-4dad-941c-c4d0c44d93f9" (UID: "dfa52756-287f-4dad-941c-c4d0c44d93f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.642816 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.642853 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.642864 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.642873 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtvjw\" (UniqueName: \"kubernetes.io/projected/dfa52756-287f-4dad-941c-c4d0c44d93f9-kube-api-access-vtvjw\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.642883 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.646288 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfa52756-287f-4dad-941c-c4d0c44d93f9" (UID: "dfa52756-287f-4dad-941c-c4d0c44d93f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.688558 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5578f9874f-7lb9c"] Sep 30 03:13:08 crc kubenswrapper[4744]: E0930 03:13:08.688947 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa52756-287f-4dad-941c-c4d0c44d93f9" containerName="init" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.688958 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa52756-287f-4dad-941c-c4d0c44d93f9" containerName="init" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.689143 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa52756-287f-4dad-941c-c4d0c44d93f9" containerName="init" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.698176 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.709026 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.709204 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.733521 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5578f9874f-7lb9c"] Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.744610 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfa52756-287f-4dad-941c-c4d0c44d93f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846559 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-ovndb-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846609 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-httpd-config\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846686 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-combined-ca-bundle\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " 
pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846723 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-public-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846756 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks9pn\" (UniqueName: \"kubernetes.io/projected/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-kube-api-access-ks9pn\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846770 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-config\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.846812 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-internal-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.948145 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-combined-ca-bundle\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " 
pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.948209 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-public-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.948235 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks9pn\" (UniqueName: \"kubernetes.io/projected/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-kube-api-access-ks9pn\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.948252 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-config\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.948310 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-internal-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.948356 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-ovndb-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 
03:13:08.948390 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-httpd-config\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.962942 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-public-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.962947 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-internal-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.963158 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-config\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.964031 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-httpd-config\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.969714 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-combined-ca-bundle\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.975014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-ovndb-tls-certs\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:08 crc kubenswrapper[4744]: I0930 03:13:08.989103 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks9pn\" (UniqueName: \"kubernetes.io/projected/ecbf3c72-f1cb-48fd-8823-3d3ae2040c86-kube-api-access-ks9pn\") pod \"neutron-5578f9874f-7lb9c\" (UID: \"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86\") " pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.070639 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-787b588c76-v5mnn" event={"ID":"1f214ceb-c91a-4672-8711-9728a3f5e3f3","Type":"ContainerStarted","Data":"4a0f5cb143bcfa57cbcb5bcafed97ea70a4e3574637e8b06418ee825b8820047"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.071240 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-787b588c76-v5mnn" event={"ID":"1f214ceb-c91a-4672-8711-9728a3f5e3f3","Type":"ContainerStarted","Data":"4ff69006dc4a5c42d13adad171a2a3135ef326af540050a2d29e2347ee5a8552"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.081808 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.091522 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerID="b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af" exitCode=0 Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.091753 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" event={"ID":"6ca18f47-7a09-4040-89f2-0b8c3f77a032","Type":"ContainerDied","Data":"b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.092995 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-787b588c76-v5mnn" podStartSLOduration=21.092985922 podStartE2EDuration="21.092985922s" podCreationTimestamp="2025-09-30 03:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:09.087847473 +0000 UTC m=+1116.261067447" watchObservedRunningTime="2025-09-30 03:13:09.092985922 +0000 UTC m=+1116.266205886" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.094741 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"173283f1-6dba-4c82-806f-0319e6ab1785","Type":"ContainerStarted","Data":"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.094761 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"173283f1-6dba-4c82-806f-0319e6ab1785","Type":"ContainerStarted","Data":"71cd0d346838a9b1b89375141df775cd925ca4489a3666c4169bb43435ce0935"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.109866 4744 generic.go:334] "Generic (PLEG): container finished" podID="6d629c05-1300-4fb5-8f08-211a133fffe8" 
containerID="872fde3c707408ad072c82898450c2626de8273f8aab39941386a7ca03ae0278" exitCode=0 Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.109949 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dpxjq" event={"ID":"6d629c05-1300-4fb5-8f08-211a133fffe8","Type":"ContainerDied","Data":"872fde3c707408ad072c82898450c2626de8273f8aab39941386a7ca03ae0278"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.119159 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.119293 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-6htj2" event={"ID":"dfa52756-287f-4dad-941c-c4d0c44d93f9","Type":"ContainerDied","Data":"fcc3989aedfaeec7eb5c75f87d1e70c72361019b79e33ff2ea00824055540d33"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.119381 4744 scope.go:117] "RemoveContainer" containerID="9c57a266a446a6d8d4ab35271916442848419ae0514e6fbf43b0ff2b96f49ad4" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.147740 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70a293af-94d1-411a-b43e-2a6cdf13fd07","Type":"ContainerStarted","Data":"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.166167 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerStarted","Data":"7b972a0943fad6eddc7d23af5bcaac6e173a5a1f4962be37ca805011abeb6f40"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.185540 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd595ddb6-2wvzb" 
event={"ID":"d8d97540-160a-4b25-9a0a-7ee3c27775f3","Type":"ContainerStarted","Data":"8c3941b8702915ec6b162e1e49b1276a4d9ca47032d0c110593402c9b0f7311e"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.185582 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd595ddb6-2wvzb" event={"ID":"d8d97540-160a-4b25-9a0a-7ee3c27775f3","Type":"ContainerStarted","Data":"a8cf03771592eff4cb375905c13d671076f830af09c4e80ec6fcced63bc535eb"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.185801 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.196096 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nm8l8" event={"ID":"08504742-967f-491a-a3ab-9ddcadb556c4","Type":"ContainerStarted","Data":"dffce1e13b41c3dbfe145e00caa36d90648e6f0d4f561b3a6a262e54cb9bb903"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.227572 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78db449746-kg7zl" event={"ID":"ff31735f-472e-4b3a-8d81-bc5c392aec09","Type":"ContainerStarted","Data":"cccf51dfe5068a04e5908dde3205641596493b1d15521b856a84d233664ea31f"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.227608 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78db449746-kg7zl" event={"ID":"ff31735f-472e-4b3a-8d81-bc5c392aec09","Type":"ContainerStarted","Data":"f60ab53badf914b2b68eaa6c177d6c2005afd0e9fae7af25c0dccbb0de6e7f46"} Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.238595 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-6htj2"] Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.251164 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dd595ddb6-2wvzb" podStartSLOduration=3.251148949 podStartE2EDuration="3.251148949s" 
podCreationTimestamp="2025-09-30 03:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:09.240624453 +0000 UTC m=+1116.413844427" watchObservedRunningTime="2025-09-30 03:13:09.251148949 +0000 UTC m=+1116.424368923" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.252399 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-6htj2"] Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.292582 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nm8l8" podStartSLOduration=25.292566675 podStartE2EDuration="25.292566675s" podCreationTimestamp="2025-09-30 03:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:09.268660403 +0000 UTC m=+1116.441880377" watchObservedRunningTime="2025-09-30 03:13:09.292566675 +0000 UTC m=+1116.465786649" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.293524 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78db449746-kg7zl" podStartSLOduration=21.293519394 podStartE2EDuration="21.293519394s" podCreationTimestamp="2025-09-30 03:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:09.291868923 +0000 UTC m=+1116.465088897" watchObservedRunningTime="2025-09-30 03:13:09.293519394 +0000 UTC m=+1116.466739368" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.548505 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa52756-287f-4dad-941c-c4d0c44d93f9" path="/var/lib/kubelet/pods/dfa52756-287f-4dad-941c-c4d0c44d93f9/volumes" Sep 30 03:13:09 crc kubenswrapper[4744]: I0930 03:13:09.840902 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-5578f9874f-7lb9c"] Sep 30 03:13:09 crc kubenswrapper[4744]: W0930 03:13:09.849641 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecbf3c72_f1cb_48fd_8823_3d3ae2040c86.slice/crio-0148889428fabf41f3c3864c9ebd6e401a1aca617f627793ad440285008e79d2 WatchSource:0}: Error finding container 0148889428fabf41f3c3864c9ebd6e401a1aca617f627793ad440285008e79d2: Status 404 returned error can't find the container with id 0148889428fabf41f3c3864c9ebd6e401a1aca617f627793ad440285008e79d2 Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.241468 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" event={"ID":"6ca18f47-7a09-4040-89f2-0b8c3f77a032","Type":"ContainerStarted","Data":"c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5"} Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.242522 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.244132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5578f9874f-7lb9c" event={"ID":"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86","Type":"ContainerStarted","Data":"0148889428fabf41f3c3864c9ebd6e401a1aca617f627793ad440285008e79d2"} Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.252680 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70a293af-94d1-411a-b43e-2a6cdf13fd07","Type":"ContainerStarted","Data":"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0"} Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.252929 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-httpd" 
containerID="cri-o://b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0" gracePeriod=30 Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.252901 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-log" containerID="cri-o://14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73" gracePeriod=30 Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.270480 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" podStartSLOduration=4.270459815 podStartE2EDuration="4.270459815s" podCreationTimestamp="2025-09-30 03:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:10.257698859 +0000 UTC m=+1117.430918853" watchObservedRunningTime="2025-09-30 03:13:10.270459815 +0000 UTC m=+1117.443679789" Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.300340 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=29.300317731 podStartE2EDuration="29.300317731s" podCreationTimestamp="2025-09-30 03:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:10.295203183 +0000 UTC m=+1117.468423157" watchObservedRunningTime="2025-09-30 03:13:10.300317731 +0000 UTC m=+1117.473537705" Sep 30 03:13:10 crc kubenswrapper[4744]: I0930 03:13:10.866573 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dpxjq" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.018984 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-config-data\") pod \"6d629c05-1300-4fb5-8f08-211a133fffe8\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.019342 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lft7\" (UniqueName: \"kubernetes.io/projected/6d629c05-1300-4fb5-8f08-211a133fffe8-kube-api-access-8lft7\") pod \"6d629c05-1300-4fb5-8f08-211a133fffe8\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.019387 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-combined-ca-bundle\") pod \"6d629c05-1300-4fb5-8f08-211a133fffe8\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.019448 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d629c05-1300-4fb5-8f08-211a133fffe8-logs\") pod \"6d629c05-1300-4fb5-8f08-211a133fffe8\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.019527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-scripts\") pod \"6d629c05-1300-4fb5-8f08-211a133fffe8\" (UID: \"6d629c05-1300-4fb5-8f08-211a133fffe8\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.022854 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6d629c05-1300-4fb5-8f08-211a133fffe8-logs" (OuterVolumeSpecName: "logs") pod "6d629c05-1300-4fb5-8f08-211a133fffe8" (UID: "6d629c05-1300-4fb5-8f08-211a133fffe8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.023598 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-scripts" (OuterVolumeSpecName: "scripts") pod "6d629c05-1300-4fb5-8f08-211a133fffe8" (UID: "6d629c05-1300-4fb5-8f08-211a133fffe8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.029020 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d629c05-1300-4fb5-8f08-211a133fffe8-kube-api-access-8lft7" (OuterVolumeSpecName: "kube-api-access-8lft7") pod "6d629c05-1300-4fb5-8f08-211a133fffe8" (UID: "6d629c05-1300-4fb5-8f08-211a133fffe8"). InnerVolumeSpecName "kube-api-access-8lft7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.051451 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d629c05-1300-4fb5-8f08-211a133fffe8" (UID: "6d629c05-1300-4fb5-8f08-211a133fffe8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.121787 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lft7\" (UniqueName: \"kubernetes.io/projected/6d629c05-1300-4fb5-8f08-211a133fffe8-kube-api-access-8lft7\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.121820 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.121829 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d629c05-1300-4fb5-8f08-211a133fffe8-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.121837 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.122728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-config-data" (OuterVolumeSpecName: "config-data") pod "6d629c05-1300-4fb5-8f08-211a133fffe8" (UID: "6d629c05-1300-4fb5-8f08-211a133fffe8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.182067 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.227466 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d629c05-1300-4fb5-8f08-211a133fffe8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.249225 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7ddc58d856-kwfp8"] Sep 30 03:13:11 crc kubenswrapper[4744]: E0930 03:13:11.253997 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-log" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.254407 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-log" Sep 30 03:13:11 crc kubenswrapper[4744]: E0930 03:13:11.254470 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-httpd" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.254477 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-httpd" Sep 30 03:13:11 crc kubenswrapper[4744]: E0930 03:13:11.254486 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d629c05-1300-4fb5-8f08-211a133fffe8" containerName="placement-db-sync" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.254492 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d629c05-1300-4fb5-8f08-211a133fffe8" containerName="placement-db-sync" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.268954 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d629c05-1300-4fb5-8f08-211a133fffe8" containerName="placement-db-sync" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.269009 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-httpd" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.269023 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerName="glance-log" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.269930 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7ddc58d856-kwfp8"] Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.270013 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.272516 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.272604 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.299882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"173283f1-6dba-4c82-806f-0319e6ab1785","Type":"ContainerStarted","Data":"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe"} Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.300161 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-log" containerID="cri-o://133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32" gracePeriod=30 Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.300601 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-httpd" containerID="cri-o://e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe" gracePeriod=30 Sep 
30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.321861 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5578f9874f-7lb9c" event={"ID":"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86","Type":"ContainerStarted","Data":"313a849deea4366a3be75c204cb7e091d5b5048cf7ac4f596fe20e0604d43e67"} Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.321922 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328086 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-config-data\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328244 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-scripts\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328312 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-ceph\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328459 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpz97\" (UniqueName: 
\"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-kube-api-access-xpz97\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328484 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-logs\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328504 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-httpd-run\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.328544 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-combined-ca-bundle\") pod \"70a293af-94d1-411a-b43e-2a6cdf13fd07\" (UID: \"70a293af-94d1-411a-b43e-2a6cdf13fd07\") " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.330905 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-logs" (OuterVolumeSpecName: "logs") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.334664 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.334643821 podStartE2EDuration="31.334643821s" podCreationTimestamp="2025-09-30 03:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:11.319720779 +0000 UTC m=+1118.492940783" watchObservedRunningTime="2025-09-30 03:13:11.334643821 +0000 UTC m=+1118.507863785" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.334747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dpxjq" event={"ID":"6d629c05-1300-4fb5-8f08-211a133fffe8","Type":"ContainerDied","Data":"ade53214afd893e70d03b78d08eee4e09d58d22efbfdd4d4e2ff34eae53284ec"} Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.334788 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade53214afd893e70d03b78d08eee4e09d58d22efbfdd4d4e2ff34eae53284ec" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.334867 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dpxjq" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.336011 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.338704 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-kube-api-access-xpz97" (OuterVolumeSpecName: "kube-api-access-xpz97") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "kube-api-access-xpz97". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.344901 4744 generic.go:334] "Generic (PLEG): container finished" podID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerID="b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0" exitCode=0 Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.344949 4744 generic.go:334] "Generic (PLEG): container finished" podID="70a293af-94d1-411a-b43e-2a6cdf13fd07" containerID="14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73" exitCode=143 Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.346158 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.346323 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70a293af-94d1-411a-b43e-2a6cdf13fd07","Type":"ContainerDied","Data":"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0"} Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.346425 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70a293af-94d1-411a-b43e-2a6cdf13fd07","Type":"ContainerDied","Data":"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73"} Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.346437 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"70a293af-94d1-411a-b43e-2a6cdf13fd07","Type":"ContainerDied","Data":"1745b4330ca4766c148b342d6a79c602c8839a6e9e904960e7c5a8c9458aba55"} Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.346455 4744 scope.go:117] "RemoveContainer" containerID="b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.354947 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-scripts" (OuterVolumeSpecName: "scripts") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.355211 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-ceph" (OuterVolumeSpecName: "ceph") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.360157 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5578f9874f-7lb9c" podStartSLOduration=3.360139492 podStartE2EDuration="3.360139492s" podCreationTimestamp="2025-09-30 03:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:11.343294561 +0000 UTC m=+1118.516514535" watchObservedRunningTime="2025-09-30 03:13:11.360139492 +0000 UTC m=+1118.533359466" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.369095 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.410509 4744 scope.go:117] "RemoveContainer" containerID="14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.430307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-combined-ca-bundle\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.430398 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-internal-tls-certs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " 
pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.430446 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-config-data\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.430474 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-public-tls-certs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431268 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-logs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431294 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-scripts\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431309 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ghr\" (UniqueName: \"kubernetes.io/projected/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-kube-api-access-l4ghr\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " 
pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431375 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431394 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431404 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431413 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpz97\" (UniqueName: \"kubernetes.io/projected/70a293af-94d1-411a-b43e-2a6cdf13fd07-kube-api-access-xpz97\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431422 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.431433 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70a293af-94d1-411a-b43e-2a6cdf13fd07-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.438563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.447523 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-config-data" (OuterVolumeSpecName: "config-data") pod "70a293af-94d1-411a-b43e-2a6cdf13fd07" (UID: "70a293af-94d1-411a-b43e-2a6cdf13fd07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.461199 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.469028 4744 scope.go:117] "RemoveContainer" containerID="b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0" Sep 30 03:13:11 crc kubenswrapper[4744]: E0930 03:13:11.470524 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0\": container with ID starting with b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0 not found: ID does not exist" containerID="b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.470557 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0"} err="failed to get container status \"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0\": rpc error: code = NotFound desc = could not find container \"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0\": container with ID starting with b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0 not found: ID does not exist" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 
03:13:11.470580 4744 scope.go:117] "RemoveContainer" containerID="14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73" Sep 30 03:13:11 crc kubenswrapper[4744]: E0930 03:13:11.470958 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73\": container with ID starting with 14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73 not found: ID does not exist" containerID="14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.470997 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73"} err="failed to get container status \"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73\": rpc error: code = NotFound desc = could not find container \"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73\": container with ID starting with 14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73 not found: ID does not exist" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.471011 4744 scope.go:117] "RemoveContainer" containerID="b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.471523 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0"} err="failed to get container status \"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0\": rpc error: code = NotFound desc = could not find container \"b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0\": container with ID starting with b921a81181eb150655fdc5e6f07444f77a7d6c08847260b3ceb0aad46195eef0 not found: ID does not exist" Sep 30 03:13:11 crc 
kubenswrapper[4744]: I0930 03:13:11.471544 4744 scope.go:117] "RemoveContainer" containerID="14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.471706 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73"} err="failed to get container status \"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73\": rpc error: code = NotFound desc = could not find container \"14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73\": container with ID starting with 14993de9c910782a3237d53693e05014488fdbdc7e17537d5efff7a1996c1f73 not found: ID does not exist" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.546549 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-combined-ca-bundle\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.546909 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-internal-tls-certs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.547047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-config-data\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.547107 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-public-tls-certs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.547287 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-logs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.547327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-scripts\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.547353 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ghr\" (UniqueName: \"kubernetes.io/projected/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-kube-api-access-l4ghr\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.553060 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.557918 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc 
kubenswrapper[4744]: I0930 03:13:11.557948 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70a293af-94d1-411a-b43e-2a6cdf13fd07-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.559145 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-scripts\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.559295 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-config-data\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.560007 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-logs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.571337 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-combined-ca-bundle\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.572220 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-public-tls-certs\") pod \"placement-7ddc58d856-kwfp8\" (UID: 
\"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.579656 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ghr\" (UniqueName: \"kubernetes.io/projected/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-kube-api-access-l4ghr\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.585980 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/825a9fa1-9368-48f2-9baa-1b8390d0cd3a-internal-tls-certs\") pod \"placement-7ddc58d856-kwfp8\" (UID: \"825a9fa1-9368-48f2-9baa-1b8390d0cd3a\") " pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.586401 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.760218 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.773234 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.810206 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.811627 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.818727 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.819026 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.819789 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.966929 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.966987 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967048 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967076 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967116 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-logs\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967136 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967167 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967227 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xlj\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-kube-api-access-w9xlj\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:11 crc kubenswrapper[4744]: I0930 03:13:11.967268 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.068913 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.068974 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.068994 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-logs\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069013 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069072 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9xlj\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-kube-api-access-w9xlj\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069096 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069148 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069182 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.069434 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.073766 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-logs\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.074107 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.074928 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.074991 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.075214 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" 
Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.077341 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.089832 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.102110 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9xlj\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-kube-api-access-w9xlj\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.116532 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.146132 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.146523 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7ddc58d856-kwfp8"] Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.227706 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377195 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-ceph\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377269 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-combined-ca-bundle\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377304 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-scripts\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377472 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-httpd-run\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377511 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmqdn\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-kube-api-access-vmqdn\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-logs\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377552 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-config-data\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.377584 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"173283f1-6dba-4c82-806f-0319e6ab1785\" (UID: \"173283f1-6dba-4c82-806f-0319e6ab1785\") " Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.378805 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.383007 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-ceph" (OuterVolumeSpecName: "ceph") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.385339 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.386043 4744 generic.go:334] "Generic (PLEG): container finished" podID="08504742-967f-491a-a3ab-9ddcadb556c4" containerID="dffce1e13b41c3dbfe145e00caa36d90648e6f0d4f561b3a6a262e54cb9bb903" exitCode=0 Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.386128 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-scripts" (OuterVolumeSpecName: "scripts") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.386248 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nm8l8" event={"ID":"08504742-967f-491a-a3ab-9ddcadb556c4","Type":"ContainerDied","Data":"dffce1e13b41c3dbfe145e00caa36d90648e6f0d4f561b3a6a262e54cb9bb903"} Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.386726 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-logs" (OuterVolumeSpecName: "logs") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.389622 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-kube-api-access-vmqdn" (OuterVolumeSpecName: "kube-api-access-vmqdn") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "kube-api-access-vmqdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.394587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ddc58d856-kwfp8" event={"ID":"825a9fa1-9368-48f2-9baa-1b8390d0cd3a","Type":"ContainerStarted","Data":"c4467849034742ada18a9012ed1cd687c0ec5f74b403f60cb9750176108e66e2"} Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402320 4744 generic.go:334] "Generic (PLEG): container finished" podID="173283f1-6dba-4c82-806f-0319e6ab1785" containerID="e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe" exitCode=0 Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402393 4744 generic.go:334] "Generic (PLEG): container finished" podID="173283f1-6dba-4c82-806f-0319e6ab1785" containerID="133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32" exitCode=143 Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402445 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402493 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"173283f1-6dba-4c82-806f-0319e6ab1785","Type":"ContainerDied","Data":"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe"} Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402540 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"173283f1-6dba-4c82-806f-0319e6ab1785","Type":"ContainerDied","Data":"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32"} Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402551 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"173283f1-6dba-4c82-806f-0319e6ab1785","Type":"ContainerDied","Data":"71cd0d346838a9b1b89375141df775cd925ca4489a3666c4169bb43435ce0935"} Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.402570 4744 scope.go:117] "RemoveContainer" containerID="e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.416331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5578f9874f-7lb9c" event={"ID":"ecbf3c72-f1cb-48fd-8823-3d3ae2040c86","Type":"ContainerStarted","Data":"5d125cbf0b1734b73aa4ce9af66d06c2ec08fd9d6dfb4dc81cae299c217050ea"} Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.440280 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.452887 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-config-data" (OuterVolumeSpecName: "config-data") pod "173283f1-6dba-4c82-806f-0319e6ab1785" (UID: "173283f1-6dba-4c82-806f-0319e6ab1785"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.461031 4744 scope.go:117] "RemoveContainer" containerID="133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.479643 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.479992 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmqdn\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-kube-api-access-vmqdn\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.480009 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/173283f1-6dba-4c82-806f-0319e6ab1785-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.480021 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.480070 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 03:13:12 crc kubenswrapper[4744]: 
I0930 03:13:12.480084 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/173283f1-6dba-4c82-806f-0319e6ab1785-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.480095 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.480105 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/173283f1-6dba-4c82-806f-0319e6ab1785-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.498510 4744 scope.go:117] "RemoveContainer" containerID="e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe" Sep 30 03:13:12 crc kubenswrapper[4744]: E0930 03:13:12.504549 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe\": container with ID starting with e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe not found: ID does not exist" containerID="e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.504584 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe"} err="failed to get container status \"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe\": rpc error: code = NotFound desc = could not find container \"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe\": container with ID starting with e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe not found: ID does not exist" Sep 30 03:13:12 crc 
kubenswrapper[4744]: I0930 03:13:12.504608 4744 scope.go:117] "RemoveContainer" containerID="133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32" Sep 30 03:13:12 crc kubenswrapper[4744]: E0930 03:13:12.504927 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32\": container with ID starting with 133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32 not found: ID does not exist" containerID="133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.504973 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32"} err="failed to get container status \"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32\": rpc error: code = NotFound desc = could not find container \"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32\": container with ID starting with 133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32 not found: ID does not exist" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.505001 4744 scope.go:117] "RemoveContainer" containerID="e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.506795 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe"} err="failed to get container status \"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe\": rpc error: code = NotFound desc = could not find container \"e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe\": container with ID starting with e2fe2c066fcb09f850288ba080c7d95ed44d641ff5ec8cb5edf87b4e38d19cbe not found: ID does not exist" Sep 30 
03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.506831 4744 scope.go:117] "RemoveContainer" containerID="133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.507165 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32"} err="failed to get container status \"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32\": rpc error: code = NotFound desc = could not find container \"133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32\": container with ID starting with 133e90c4543ea92de6693de6f195c00a9c9ecc559a941fc7f0f7ff19c0a3ab32 not found: ID does not exist" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.526278 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.582219 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.747665 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.753764 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.779583 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:13:12 crc kubenswrapper[4744]: E0930 03:13:12.780874 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-httpd" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.780896 
4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-httpd" Sep 30 03:13:12 crc kubenswrapper[4744]: E0930 03:13:12.780906 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-log" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.780912 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-log" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.781099 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-log" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.781120 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" containerName="glance-httpd" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.782087 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.785222 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.785655 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.785841 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.792689 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:13:12 crc kubenswrapper[4744]: W0930 03:13:12.804850 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2daed0a_2a52_458f_a872_1f7b875e1a39.slice/crio-1e8ecd2101d12b8a8a6e354f29903c43f5a6a84a7a8dc0dcd250a806576a7859 WatchSource:0}: Error finding container 1e8ecd2101d12b8a8a6e354f29903c43f5a6a84a7a8dc0dcd250a806576a7859: Status 404 returned error can't find the container with id 1e8ecd2101d12b8a8a6e354f29903c43f5a6a84a7a8dc0dcd250a806576a7859 Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888132 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-ceph\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888186 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888221 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-logs\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888238 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkr6s\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-kube-api-access-qkr6s\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888299 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888317 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888413 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.888437 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-config-data\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990520 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-config-data\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990620 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-scripts\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990684 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkr6s\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-kube-api-access-qkr6s\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-logs\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990749 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990767 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" 
Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.990810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.991028 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.991735 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.992320 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-logs\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.998220 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-scripts\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.998488 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-config-data\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:12 crc kubenswrapper[4744]: I0930 03:13:12.999179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.001700 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-ceph\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.010867 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.034226 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkr6s\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-kube-api-access-qkr6s\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.051245 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " pod="openstack/glance-default-external-api-0" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.151740 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.439297 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ddc58d856-kwfp8" event={"ID":"825a9fa1-9368-48f2-9baa-1b8390d0cd3a","Type":"ContainerStarted","Data":"9900247384c8ac0e24a1ae6ceb191a20af86123c20892aa76f9a164e62a4ebb1"} Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.439339 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ddc58d856-kwfp8" event={"ID":"825a9fa1-9368-48f2-9baa-1b8390d0cd3a","Type":"ContainerStarted","Data":"c8da428854586f063459973fc9eee2f2ee5065d6694d6171282e3d4664a1c1c6"} Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.440361 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.440401 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.446117 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2daed0a-2a52-458f-a872-1f7b875e1a39","Type":"ContainerStarted","Data":"1e8ecd2101d12b8a8a6e354f29903c43f5a6a84a7a8dc0dcd250a806576a7859"} Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.466398 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7ddc58d856-kwfp8" podStartSLOduration=2.466357711 podStartE2EDuration="2.466357711s" podCreationTimestamp="2025-09-30 03:13:11 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:13.458223018 +0000 UTC m=+1120.631442982" watchObservedRunningTime="2025-09-30 03:13:13.466357711 +0000 UTC m=+1120.639577695" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.551826 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173283f1-6dba-4c82-806f-0319e6ab1785" path="/var/lib/kubelet/pods/173283f1-6dba-4c82-806f-0319e6ab1785/volumes" Sep 30 03:13:13 crc kubenswrapper[4744]: I0930 03:13:13.552690 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a293af-94d1-411a-b43e-2a6cdf13fd07" path="/var/lib/kubelet/pods/70a293af-94d1-411a-b43e-2a6cdf13fd07/volumes" Sep 30 03:13:14 crc kubenswrapper[4744]: I0930 03:13:14.865784 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.040546 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-credential-keys\") pod \"08504742-967f-491a-a3ab-9ddcadb556c4\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.040590 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-combined-ca-bundle\") pod \"08504742-967f-491a-a3ab-9ddcadb556c4\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.040648 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-scripts\") pod \"08504742-967f-491a-a3ab-9ddcadb556c4\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " Sep 30 03:13:15 crc 
kubenswrapper[4744]: I0930 03:13:15.040698 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-config-data\") pod \"08504742-967f-491a-a3ab-9ddcadb556c4\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.040736 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-fernet-keys\") pod \"08504742-967f-491a-a3ab-9ddcadb556c4\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.040805 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxbg\" (UniqueName: \"kubernetes.io/projected/08504742-967f-491a-a3ab-9ddcadb556c4-kube-api-access-xwxbg\") pod \"08504742-967f-491a-a3ab-9ddcadb556c4\" (UID: \"08504742-967f-491a-a3ab-9ddcadb556c4\") " Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.050250 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08504742-967f-491a-a3ab-9ddcadb556c4-kube-api-access-xwxbg" (OuterVolumeSpecName: "kube-api-access-xwxbg") pod "08504742-967f-491a-a3ab-9ddcadb556c4" (UID: "08504742-967f-491a-a3ab-9ddcadb556c4"). InnerVolumeSpecName "kube-api-access-xwxbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.055489 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "08504742-967f-491a-a3ab-9ddcadb556c4" (UID: "08504742-967f-491a-a3ab-9ddcadb556c4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.056620 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "08504742-967f-491a-a3ab-9ddcadb556c4" (UID: "08504742-967f-491a-a3ab-9ddcadb556c4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.065323 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-scripts" (OuterVolumeSpecName: "scripts") pod "08504742-967f-491a-a3ab-9ddcadb556c4" (UID: "08504742-967f-491a-a3ab-9ddcadb556c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.094480 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08504742-967f-491a-a3ab-9ddcadb556c4" (UID: "08504742-967f-491a-a3ab-9ddcadb556c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.101571 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-config-data" (OuterVolumeSpecName: "config-data") pod "08504742-967f-491a-a3ab-9ddcadb556c4" (UID: "08504742-967f-491a-a3ab-9ddcadb556c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.144270 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwxbg\" (UniqueName: \"kubernetes.io/projected/08504742-967f-491a-a3ab-9ddcadb556c4-kube-api-access-xwxbg\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.144305 4744 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.144315 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.144326 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.144335 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.144345 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08504742-967f-491a-a3ab-9ddcadb556c4-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.466734 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nm8l8" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.466750 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nm8l8" event={"ID":"08504742-967f-491a-a3ab-9ddcadb556c4","Type":"ContainerDied","Data":"25204fdfe511df50a37438f552d19d51e47c05d5b9dab9c1e972070080754bcc"} Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.466802 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25204fdfe511df50a37438f552d19d51e47c05d5b9dab9c1e972070080754bcc" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.953457 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-59ddc4db88-d9q99"] Sep 30 03:13:15 crc kubenswrapper[4744]: E0930 03:13:15.953958 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08504742-967f-491a-a3ab-9ddcadb556c4" containerName="keystone-bootstrap" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.953971 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08504742-967f-491a-a3ab-9ddcadb556c4" containerName="keystone-bootstrap" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.954147 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="08504742-967f-491a-a3ab-9ddcadb556c4" containerName="keystone-bootstrap" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.954781 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.958856 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-internal-tls-certs\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.958891 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-combined-ca-bundle\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.958944 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-scripts\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.958985 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-public-tls-certs\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.959008 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-fernet-keys\") pod \"keystone-59ddc4db88-d9q99\" (UID: 
\"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.959039 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-config-data\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.959077 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwbtr\" (UniqueName: \"kubernetes.io/projected/02356cb4-2497-483a-9742-acd6b9080dc2-kube-api-access-hwbtr\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.959096 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-credential-keys\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.961034 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.961140 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.961311 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.961341 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 
03:13:15.961496 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.961522 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g28pb" Sep 30 03:13:15 crc kubenswrapper[4744]: I0930 03:13:15.963866 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59ddc4db88-d9q99"] Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060328 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-public-tls-certs\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060425 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-fernet-keys\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060465 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-config-data\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060504 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwbtr\" (UniqueName: \"kubernetes.io/projected/02356cb4-2497-483a-9742-acd6b9080dc2-kube-api-access-hwbtr\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 
03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060527 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-credential-keys\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-internal-tls-certs\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060580 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-combined-ca-bundle\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.060624 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-scripts\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.073923 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-scripts\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.074909 4744 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-config-data\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.075164 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-public-tls-certs\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.077064 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-internal-tls-certs\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.082757 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-credential-keys\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.086919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-fernet-keys\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.092885 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02356cb4-2497-483a-9742-acd6b9080dc2-combined-ca-bundle\") pod 
\"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.097146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwbtr\" (UniqueName: \"kubernetes.io/projected/02356cb4-2497-483a-9742-acd6b9080dc2-kube-api-access-hwbtr\") pod \"keystone-59ddc4db88-d9q99\" (UID: \"02356cb4-2497-483a-9742-acd6b9080dc2\") " pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.181913 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.327610 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.347042 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.608744 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.668437 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-gnvkr"] Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.668632 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="dnsmasq-dns" containerID="cri-o://4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3" gracePeriod=10 Sep 30 03:13:16 crc kubenswrapper[4744]: I0930 03:13:16.733587 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.154313 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59ddc4db88-d9q99"] Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.360717 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.404688 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.487538 4744 generic.go:334] "Generic (PLEG): container finished" podID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerID="4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3" exitCode=0 Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.487635 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" event={"ID":"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315","Type":"ContainerDied","Data":"4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3"} Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.487661 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" event={"ID":"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315","Type":"ContainerDied","Data":"d1267098634b6d137cb718556050380e39bed13a10e57569a79aec5fb9e3cab7"} Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.487676 4744 scope.go:117] "RemoveContainer" containerID="4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.487826 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-gnvkr" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.489229 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-sb\") pod \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.489536 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-config\") pod \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.490179 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-nb\") pod \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.490231 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-svc\") pod \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.490424 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clw78\" (UniqueName: \"kubernetes.io/projected/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-kube-api-access-clw78\") pod \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.490454 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-swift-storage-0\") pod \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\" (UID: \"bb2c1a71-47b6-4b19-bb2a-ba36eafd1315\") " Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.496120 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d90c9655-5af0-4978-8c33-23be71d00047","Type":"ContainerStarted","Data":"8be5408d7809b20c1c8b035b6f38cb0ec660fe0e50045a15ea836391e097be46"} Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.503490 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59ddc4db88-d9q99" event={"ID":"02356cb4-2497-483a-9742-acd6b9080dc2","Type":"ContainerStarted","Data":"c2c6dad009087a3556e2adb0fe417a0fd00399b94a6675bec3c02ec615a85447"} Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.518626 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-kube-api-access-clw78" (OuterVolumeSpecName: "kube-api-access-clw78") pod "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" (UID: "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315"). InnerVolumeSpecName "kube-api-access-clw78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.538658 4744 scope.go:117] "RemoveContainer" containerID="6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.539014 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-75phb" event={"ID":"b118b5fa-982e-4bd6-a6dc-5d2015b3b399","Type":"ContainerStarted","Data":"71e2cf222297c03ad431b910faa1034a2782bd3476c1ee1489728f137ab7ff18"} Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.563803 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-75phb" podStartSLOduration=3.21080627 podStartE2EDuration="41.563784686s" podCreationTimestamp="2025-09-30 03:12:36 +0000 UTC" firstStartedPulling="2025-09-30 03:12:38.498120089 +0000 UTC m=+1085.671340053" lastFinishedPulling="2025-09-30 03:13:16.851098495 +0000 UTC m=+1124.024318469" observedRunningTime="2025-09-30 03:13:17.557953026 +0000 UTC m=+1124.731173010" watchObservedRunningTime="2025-09-30 03:13:17.563784686 +0000 UTC m=+1124.737004660" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.581353 4744 scope.go:117] "RemoveContainer" containerID="4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3" Sep 30 03:13:17 crc kubenswrapper[4744]: E0930 03:13:17.582618 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3\": container with ID starting with 4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3 not found: ID does not exist" containerID="4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.582650 4744 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3"} err="failed to get container status \"4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3\": rpc error: code = NotFound desc = could not find container \"4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3\": container with ID starting with 4889da92eb08efb6680757fc4a3147b9889a98661c29eeef144676a51cc435b3 not found: ID does not exist" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.582669 4744 scope.go:117] "RemoveContainer" containerID="6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124" Sep 30 03:13:17 crc kubenswrapper[4744]: E0930 03:13:17.586426 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124\": container with ID starting with 6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124 not found: ID does not exist" containerID="6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.586449 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124"} err="failed to get container status \"6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124\": rpc error: code = NotFound desc = could not find container \"6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124\": container with ID starting with 6840d508316323a5e5f13c26d4fd92980a65fee59c5feb010575a11432354124 not found: ID does not exist" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.596459 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clw78\" (UniqueName: \"kubernetes.io/projected/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-kube-api-access-clw78\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:17 
crc kubenswrapper[4744]: I0930 03:13:17.781602 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" (UID: "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.799555 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.894883 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" (UID: "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.901239 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-config" (OuterVolumeSpecName: "config") pod "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" (UID: "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.902735 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.902765 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.904276 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" (UID: "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:17 crc kubenswrapper[4744]: I0930 03:13:17.945773 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" (UID: "bb2c1a71-47b6-4b19-bb2a-ba36eafd1315"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.004643 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.004677 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.199777 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-gnvkr"] Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.207249 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-gnvkr"] Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.530648 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59ddc4db88-d9q99" event={"ID":"02356cb4-2497-483a-9742-acd6b9080dc2","Type":"ContainerStarted","Data":"b519c60cb3f2b5fca0cd0686f1eeacac9efd153e8096de4945712658829eea93"} Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.531987 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.532870 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pwqjw" event={"ID":"a24c42a2-4afa-4c32-ba87-18251fd1345a","Type":"ContainerStarted","Data":"869acb7ec54dcc781e8364e30d14cdf03deb15f640b00501fbab0d05595a4f44"} Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.536847 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerStarted","Data":"0b7f558a23c54f3be65fae1c6af58440967097dbff23cb59f45d386b7497b879"} Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.542916 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2daed0a-2a52-458f-a872-1f7b875e1a39","Type":"ContainerStarted","Data":"847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec"} Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.550651 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-59ddc4db88-d9q99" podStartSLOduration=3.550638455 podStartE2EDuration="3.550638455s" podCreationTimestamp="2025-09-30 03:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:18.545758973 +0000 UTC m=+1125.718978947" watchObservedRunningTime="2025-09-30 03:13:18.550638455 +0000 UTC m=+1125.723858419" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.557568 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d90c9655-5af0-4978-8c33-23be71d00047","Type":"ContainerStarted","Data":"3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883"} Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.563145 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-pwqjw" podStartSLOduration=4.233109608 podStartE2EDuration="42.563125562s" podCreationTimestamp="2025-09-30 03:12:36 +0000 UTC" firstStartedPulling="2025-09-30 03:12:38.644591833 +0000 UTC m=+1085.817811797" lastFinishedPulling="2025-09-30 03:13:16.974607777 +0000 UTC m=+1124.147827751" observedRunningTime="2025-09-30 03:13:18.560894463 +0000 UTC m=+1125.734114437" watchObservedRunningTime="2025-09-30 03:13:18.563125562 +0000 UTC m=+1125.736345536" Sep 30 03:13:18 crc 
kubenswrapper[4744]: I0930 03:13:18.745400 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.745649 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:13:18 crc kubenswrapper[4744]: I0930 03:13:18.746727 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.115651 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.115827 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.117932 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78db449746-kg7zl" podUID="ff31735f-472e-4b3a-8d81-bc5c392aec09" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.517447 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" path="/var/lib/kubelet/pods/bb2c1a71-47b6-4b19-bb2a-ba36eafd1315/volumes" Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.568790 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bvn26" 
event={"ID":"72b19763-eb29-45ca-9431-8791543dee83","Type":"ContainerStarted","Data":"3b3026f2af2b2955aa87eab2eb9210451af07b8d9b87fdbe9ffc790294c4aedf"} Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.576452 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2daed0a-2a52-458f-a872-1f7b875e1a39","Type":"ContainerStarted","Data":"4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2"} Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.589244 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bvn26" podStartSLOduration=4.106881132 podStartE2EDuration="43.589224678s" podCreationTimestamp="2025-09-30 03:12:36 +0000 UTC" firstStartedPulling="2025-09-30 03:12:38.524788986 +0000 UTC m=+1085.698008960" lastFinishedPulling="2025-09-30 03:13:18.007132542 +0000 UTC m=+1125.180352506" observedRunningTime="2025-09-30 03:13:19.583666345 +0000 UTC m=+1126.756886319" watchObservedRunningTime="2025-09-30 03:13:19.589224678 +0000 UTC m=+1126.762444652" Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.592236 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d90c9655-5af0-4978-8c33-23be71d00047","Type":"ContainerStarted","Data":"5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade"} Sep 30 03:13:19 crc kubenswrapper[4744]: I0930 03:13:19.616559 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.616544115 podStartE2EDuration="8.616544115s" podCreationTimestamp="2025-09-30 03:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:19.613629725 +0000 UTC m=+1126.786849699" watchObservedRunningTime="2025-09-30 03:13:19.616544115 +0000 UTC m=+1126.789764089" Sep 30 03:13:19 crc 
kubenswrapper[4744]: I0930 03:13:19.638358 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.638343972 podStartE2EDuration="7.638343972s" podCreationTimestamp="2025-09-30 03:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:19.63635267 +0000 UTC m=+1126.809572644" watchObservedRunningTime="2025-09-30 03:13:19.638343972 +0000 UTC m=+1126.811563946" Sep 30 03:13:20 crc kubenswrapper[4744]: I0930 03:13:20.605143 4744 generic.go:334] "Generic (PLEG): container finished" podID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" containerID="71e2cf222297c03ad431b910faa1034a2782bd3476c1ee1489728f137ab7ff18" exitCode=0 Sep 30 03:13:20 crc kubenswrapper[4744]: I0930 03:13:20.605463 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-75phb" event={"ID":"b118b5fa-982e-4bd6-a6dc-5d2015b3b399","Type":"ContainerDied","Data":"71e2cf222297c03ad431b910faa1034a2782bd3476c1ee1489728f137ab7ff18"} Sep 30 03:13:22 crc kubenswrapper[4744]: I0930 03:13:22.146776 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:22 crc kubenswrapper[4744]: I0930 03:13:22.147097 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:22 crc kubenswrapper[4744]: I0930 03:13:22.185600 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:22 crc kubenswrapper[4744]: I0930 03:13:22.201502 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:22 crc kubenswrapper[4744]: I0930 03:13:22.634853 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Sep 30 03:13:22 crc kubenswrapper[4744]: I0930 03:13:22.634909 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.153381 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.153455 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.198928 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.207643 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.647010 4744 generic.go:334] "Generic (PLEG): container finished" podID="72b19763-eb29-45ca-9431-8791543dee83" containerID="3b3026f2af2b2955aa87eab2eb9210451af07b8d9b87fdbe9ffc790294c4aedf" exitCode=0 Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.647105 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bvn26" event={"ID":"72b19763-eb29-45ca-9431-8791543dee83","Type":"ContainerDied","Data":"3b3026f2af2b2955aa87eab2eb9210451af07b8d9b87fdbe9ffc790294c4aedf"} Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.647821 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 03:13:23 crc kubenswrapper[4744]: I0930 03:13:23.647854 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 03:13:24 crc kubenswrapper[4744]: I0930 03:13:24.951951 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:24 crc kubenswrapper[4744]: I0930 03:13:24.952267 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 03:13:24 crc kubenswrapper[4744]: I0930 03:13:24.955256 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 03:13:25 crc kubenswrapper[4744]: I0930 03:13:25.572374 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 03:13:25 crc kubenswrapper[4744]: I0930 03:13:25.576953 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.936348 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-75phb" Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.947774 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bvn26" Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.997200 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-combined-ca-bundle\") pod \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998110 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-db-sync-config-data\") pod \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998186 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b19763-eb29-45ca-9431-8791543dee83-etc-machine-id\") pod \"72b19763-eb29-45ca-9431-8791543dee83\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998221 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-scripts\") pod \"72b19763-eb29-45ca-9431-8791543dee83\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998260 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-kube-api-access-bnrg4\") pod \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\" (UID: \"b118b5fa-982e-4bd6-a6dc-5d2015b3b399\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998291 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-db-sync-config-data\") pod \"72b19763-eb29-45ca-9431-8791543dee83\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998326 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-config-data\") pod \"72b19763-eb29-45ca-9431-8791543dee83\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998371 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2fcj\" (UniqueName: \"kubernetes.io/projected/72b19763-eb29-45ca-9431-8791543dee83-kube-api-access-h2fcj\") pod \"72b19763-eb29-45ca-9431-8791543dee83\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.998431 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-combined-ca-bundle\") pod \"72b19763-eb29-45ca-9431-8791543dee83\" (UID: \"72b19763-eb29-45ca-9431-8791543dee83\") " Sep 30 03:13:26 crc kubenswrapper[4744]: I0930 03:13:26.999093 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72b19763-eb29-45ca-9431-8791543dee83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "72b19763-eb29-45ca-9431-8791543dee83" (UID: "72b19763-eb29-45ca-9431-8791543dee83"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.003976 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-kube-api-access-bnrg4" (OuterVolumeSpecName: "kube-api-access-bnrg4") pod "b118b5fa-982e-4bd6-a6dc-5d2015b3b399" (UID: "b118b5fa-982e-4bd6-a6dc-5d2015b3b399"). InnerVolumeSpecName "kube-api-access-bnrg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.017628 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b118b5fa-982e-4bd6-a6dc-5d2015b3b399" (UID: "b118b5fa-982e-4bd6-a6dc-5d2015b3b399"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.019835 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "72b19763-eb29-45ca-9431-8791543dee83" (UID: "72b19763-eb29-45ca-9431-8791543dee83"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.020528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b19763-eb29-45ca-9431-8791543dee83-kube-api-access-h2fcj" (OuterVolumeSpecName: "kube-api-access-h2fcj") pod "72b19763-eb29-45ca-9431-8791543dee83" (UID: "72b19763-eb29-45ca-9431-8791543dee83"). InnerVolumeSpecName "kube-api-access-h2fcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.024638 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-scripts" (OuterVolumeSpecName: "scripts") pod "72b19763-eb29-45ca-9431-8791543dee83" (UID: "72b19763-eb29-45ca-9431-8791543dee83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.034639 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72b19763-eb29-45ca-9431-8791543dee83" (UID: "72b19763-eb29-45ca-9431-8791543dee83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.038725 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b118b5fa-982e-4bd6-a6dc-5d2015b3b399" (UID: "b118b5fa-982e-4bd6-a6dc-5d2015b3b399"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.065249 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-config-data" (OuterVolumeSpecName: "config-data") pod "72b19763-eb29-45ca-9431-8791543dee83" (UID: "72b19763-eb29-45ca-9431-8791543dee83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100358 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnrg4\" (UniqueName: \"kubernetes.io/projected/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-kube-api-access-bnrg4\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100400 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100411 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100419 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2fcj\" (UniqueName: \"kubernetes.io/projected/72b19763-eb29-45ca-9431-8791543dee83-kube-api-access-h2fcj\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100428 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100439 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100449 4744 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b118b5fa-982e-4bd6-a6dc-5d2015b3b399-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc 
kubenswrapper[4744]: I0930 03:13:27.100458 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72b19763-eb29-45ca-9431-8791543dee83-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.100468 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b19763-eb29-45ca-9431-8791543dee83-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:27 crc kubenswrapper[4744]: E0930 03:13:27.623685 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.696780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-75phb" event={"ID":"b118b5fa-982e-4bd6-a6dc-5d2015b3b399","Type":"ContainerDied","Data":"e5ef8cec6dbfaec5e189072ecb59225c33fd0f6dca07d644897408b1481a0338"} Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.696822 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5ef8cec6dbfaec5e189072ecb59225c33fd0f6dca07d644897408b1481a0338" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.696883 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-75phb" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.701142 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerStarted","Data":"5b580ac53df6bbd1ee16e2ffe410709ae7ec969648d5fef8989283a1c83d0ecf"} Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.701324 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="ceilometer-notification-agent" containerID="cri-o://7b972a0943fad6eddc7d23af5bcaac6e173a5a1f4962be37ca805011abeb6f40" gracePeriod=30 Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.701595 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.701668 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="proxy-httpd" containerID="cri-o://5b580ac53df6bbd1ee16e2ffe410709ae7ec969648d5fef8989283a1c83d0ecf" gracePeriod=30 Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.701731 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="sg-core" containerID="cri-o://0b7f558a23c54f3be65fae1c6af58440967097dbff23cb59f45d386b7497b879" gracePeriod=30 Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.706899 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bvn26" event={"ID":"72b19763-eb29-45ca-9431-8791543dee83","Type":"ContainerDied","Data":"e6fdfecf1be38a65cd4b39854056f86579d49e65ba305543a1df6ba88e907bb7"} Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.706941 4744 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="e6fdfecf1be38a65cd4b39854056f86579d49e65ba305543a1df6ba88e907bb7" Sep 30 03:13:27 crc kubenswrapper[4744]: I0930 03:13:27.707001 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bvn26" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.179454 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-578ccf57db-dnd4k"] Sep 30 03:13:28 crc kubenswrapper[4744]: E0930 03:13:28.179796 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="init" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.179807 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="init" Sep 30 03:13:28 crc kubenswrapper[4744]: E0930 03:13:28.179825 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" containerName="barbican-db-sync" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.179831 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" containerName="barbican-db-sync" Sep 30 03:13:28 crc kubenswrapper[4744]: E0930 03:13:28.179864 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="dnsmasq-dns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.179871 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="dnsmasq-dns" Sep 30 03:13:28 crc kubenswrapper[4744]: E0930 03:13:28.179883 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b19763-eb29-45ca-9431-8791543dee83" containerName="cinder-db-sync" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.179888 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b19763-eb29-45ca-9431-8791543dee83" 
containerName="cinder-db-sync" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.180052 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2c1a71-47b6-4b19-bb2a-ba36eafd1315" containerName="dnsmasq-dns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.180066 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" containerName="barbican-db-sync" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.180075 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b19763-eb29-45ca-9431-8791543dee83" containerName="cinder-db-sync" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.180988 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.186706 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.186867 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xn7nl" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.186989 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.198511 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-58944b8f99-bl9hx"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.201403 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.210950 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221676 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65s2\" (UniqueName: \"kubernetes.io/projected/fb5969c0-4230-4813-9009-546eda8657eb-kube-api-access-h65s2\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221714 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-config-data-custom\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221753 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-config-data\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221772 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-config-data\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221815 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-combined-ca-bundle\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221851 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-combined-ca-bundle\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221897 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb5969c0-4230-4813-9009-546eda8657eb-logs\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221926 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-config-data-custom\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221949 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqlz\" (UniqueName: \"kubernetes.io/projected/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-kube-api-access-dnqlz\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: 
\"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.221965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-logs\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.243978 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-578ccf57db-dnd4k"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.267432 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58944b8f99-bl9hx"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.317131 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.319891 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324274 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-combined-ca-bundle\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324349 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb5969c0-4230-4813-9009-546eda8657eb-logs\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324395 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-config-data-custom\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324419 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqlz\" (UniqueName: \"kubernetes.io/projected/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-kube-api-access-dnqlz\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324437 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-logs\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: 
\"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324460 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-config-data-custom\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324475 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65s2\" (UniqueName: \"kubernetes.io/projected/fb5969c0-4230-4813-9009-546eda8657eb-kube-api-access-h65s2\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324508 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-config-data\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324526 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-config-data\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.324564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-combined-ca-bundle\") pod 
\"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.325167 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-logs\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.327027 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb5969c0-4230-4813-9009-546eda8657eb-logs\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.338647 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-j5xfr" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.338891 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.339006 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.339215 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.342786 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-config-data\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc 
kubenswrapper[4744]: I0930 03:13:28.342854 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-combined-ca-bundle\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.343348 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-config-data-custom\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.344888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-config-data-custom\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.345272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5969c0-4230-4813-9009-546eda8657eb-combined-ca-bundle\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.351490 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-config-data\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " 
pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.361437 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-fhw9j"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.362906 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.378934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65s2\" (UniqueName: \"kubernetes.io/projected/fb5969c0-4230-4813-9009-546eda8657eb-kube-api-access-h65s2\") pod \"barbican-worker-58944b8f99-bl9hx\" (UID: \"fb5969c0-4230-4813-9009-546eda8657eb\") " pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.379042 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.395325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqlz\" (UniqueName: \"kubernetes.io/projected/89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e-kube-api-access-dnqlz\") pod \"barbican-keystone-listener-578ccf57db-dnd4k\" (UID: \"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e\") " pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.411456 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.412898 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.417701 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426008 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426096 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426166 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426192 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426213 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-config\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426226 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426244 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426261 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4wl\" (UniqueName: \"kubernetes.io/projected/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-kube-api-access-kt4wl\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426304 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426325 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnlzj\" (UniqueName: \"kubernetes.io/projected/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-kube-api-access-fnlzj\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.426342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.442114 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-fhw9j"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.457567 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.458926 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.460610 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.477462 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.479648 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.513918 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.537901 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.537972 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-sys\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538018 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-run\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538043 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-config\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538066 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538089 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538111 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-lib-modules\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538133 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4wl\" (UniqueName: 
\"kubernetes.io/projected/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-kube-api-access-kt4wl\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538155 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538182 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538207 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538245 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538271 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538299 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnlzj\" (UniqueName: \"kubernetes.io/projected/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-kube-api-access-fnlzj\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538322 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538375 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538438 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzw9\" (UniqueName: 
\"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-kube-api-access-xrzw9\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538460 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538510 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-sys\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538532 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-dev\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538553 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " 
pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538576 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538592 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45svz\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-kube-api-access-45svz\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538617 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538639 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538661 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc 
kubenswrapper[4744]: I0930 03:13:28.538691 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538709 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-run\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538736 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538758 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538778 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-scripts\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538797 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538823 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538845 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-ceph\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538868 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-dev\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538885 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538914 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" 
(UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538933 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.538958 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.539039 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.539058 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.541789 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-sb\") pod 
\"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.542313 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.545531 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.550424 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58944b8f99-bl9hx" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.553453 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.554090 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.555575 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.560328 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-config\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.563678 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.608937 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnlzj\" (UniqueName: \"kubernetes.io/projected/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-kube-api-access-fnlzj\") pod \"dnsmasq-dns-d68b9cb4c-fhw9j\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.616510 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4wl\" (UniqueName: \"kubernetes.io/projected/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-kube-api-access-kt4wl\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.618061 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.625228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694031 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694150 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzw9\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-kube-api-access-xrzw9\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694193 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694229 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-sys\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694257 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-dev\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694278 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694302 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45svz\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-kube-api-access-45svz\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694337 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-machine-id\") pod 
\"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694358 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694393 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694443 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-run\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694472 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 
03:13:28.694485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-scripts\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694534 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-ceph\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694575 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-dev\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694589 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694627 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-lib-cinder\") 
pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694642 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694669 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694691 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694730 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-sys\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694752 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694768 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-run\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694797 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-lib-modules\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694814 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694841 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694860 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.694903 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704328 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704422 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704586 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704874 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-sys\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 
30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-dev\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.704932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.711544 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.711731 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.712525 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.712565 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-run\") pod 
\"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.712590 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-lib-modules\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.712630 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.720564 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.721034 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.721165 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.721188 4744 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-sys\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.721224 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.725279 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.725326 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-dev\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.725344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.725368 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-run\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " 
pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.725910 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-nvme\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.731822 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-ceph\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.732108 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.732993 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-fhw9j"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.733717 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.745035 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-h2rns"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.746641 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.749164 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.757905 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-h2rns"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.774163 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.774958 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.775444 4744 generic.go:334] "Generic (PLEG): container finished" podID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerID="5b580ac53df6bbd1ee16e2ffe410709ae7ec969648d5fef8989283a1c83d0ecf" exitCode=0 Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.775473 4744 generic.go:334] "Generic (PLEG): container finished" podID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerID="0b7f558a23c54f3be65fae1c6af58440967097dbff23cb59f45d386b7497b879" exitCode=2 Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.775495 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerDied","Data":"5b580ac53df6bbd1ee16e2ffe410709ae7ec969648d5fef8989283a1c83d0ecf"} Sep 30 03:13:28 crc 
kubenswrapper[4744]: I0930 03:13:28.775522 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerDied","Data":"0b7f558a23c54f3be65fae1c6af58440967097dbff23cb59f45d386b7497b879"} Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.775552 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzw9\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-kube-api-access-xrzw9\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.781630 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6db48c9776-bw4r7"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.783158 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.789539 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.790714 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.790788 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.792337 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.794896 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data-custom\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.805285 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.806054 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45svz\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-kube-api-access-45svz\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.806197 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-scripts\") pod \"cinder-backup-0\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.815097 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.821052 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.838067 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6db48c9776-bw4r7"] Sep 30 
03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.879893 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.888913 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900493 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-combined-ca-bundle\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900530 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-svc\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900566 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900585 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-logs\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900633 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900651 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900681 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqslg\" (UniqueName: \"kubernetes.io/projected/4af1755b-5573-4aee-aced-42d4b10bcebc-kube-api-access-rqslg\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900702 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data-custom\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900720 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-scripts\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: 
I0930 03:13:28.900737 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900750 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.900840 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af1755b-5573-4aee-aced-42d4b10bcebc-logs\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.902740 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.902790 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.902828 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-p7klp\" (UniqueName: \"kubernetes.io/projected/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-kube-api-access-p7klp\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.902942 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.902965 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-config\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:28 crc kubenswrapper[4744]: I0930 03:13:28.902996 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfc47\" (UniqueName: \"kubernetes.io/projected/31343c55-58c4-4c6a-854c-36bc13beb817-kube-api-access-hfc47\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004547 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004753 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-logs\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004804 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqslg\" (UniqueName: \"kubernetes.io/projected/4af1755b-5573-4aee-aced-42d4b10bcebc-kube-api-access-rqslg\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004854 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data-custom\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004873 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-scripts\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004890 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004904 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004935 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af1755b-5573-4aee-aced-42d4b10bcebc-logs\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004963 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.004981 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc 
kubenswrapper[4744]: I0930 03:13:29.004998 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7klp\" (UniqueName: \"kubernetes.io/projected/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-kube-api-access-p7klp\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.005044 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.005063 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-config\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.005083 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfc47\" (UniqueName: \"kubernetes.io/projected/31343c55-58c4-4c6a-854c-36bc13beb817-kube-api-access-hfc47\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.005114 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-combined-ca-bundle\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.005128 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-svc\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.006184 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-svc\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.007084 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-logs\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.007689 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af1755b-5573-4aee-aced-42d4b10bcebc-logs\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.008574 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.009410 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-config\") pod 
\"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.009477 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.009954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.009360 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.026306 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-combined-ca-bundle\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.030026 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data-custom\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " 
pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.030552 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.034458 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.034661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.036317 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfc47\" (UniqueName: \"kubernetes.io/projected/31343c55-58c4-4c6a-854c-36bc13beb817-kube-api-access-hfc47\") pod \"dnsmasq-dns-5784cf869f-h2rns\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.039702 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-scripts\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.039948 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p7klp\" (UniqueName: \"kubernetes.io/projected/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-kube-api-access-p7klp\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.041433 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data\") pod \"cinder-api-0\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.061048 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqslg\" (UniqueName: \"kubernetes.io/projected/4af1755b-5573-4aee-aced-42d4b10bcebc-kube-api-access-rqslg\") pod \"barbican-api-6db48c9776-bw4r7\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.116212 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78db449746-kg7zl" podUID="ff31735f-472e-4b3a-8d81-bc5c392aec09" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.165186 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.185004 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.204544 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.220040 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-578ccf57db-dnd4k"] Sep 30 03:13:29 crc kubenswrapper[4744]: W0930 03:13:29.227941 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89d0d6d3_9ea4_4d48_b7e9_3dfcfc4ba56e.slice/crio-9eb5b3ee6f523110ea68b98a00943f32268cdbe1b7225141a2990d69156e66b2 WatchSource:0}: Error finding container 9eb5b3ee6f523110ea68b98a00943f32268cdbe1b7225141a2990d69156e66b2: Status 404 returned error can't find the container with id 9eb5b3ee6f523110ea68b98a00943f32268cdbe1b7225141a2990d69156e66b2 Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.413459 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58944b8f99-bl9hx"] Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.543872 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.543910 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-fhw9j"] Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.713318 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.790674 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" event={"ID":"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e","Type":"ContainerStarted","Data":"9eb5b3ee6f523110ea68b98a00943f32268cdbe1b7225141a2990d69156e66b2"} Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.793347 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" podUID="b2e0b913-a1f5-4c3f-b445-705ba5348e5d" containerName="init" 
containerID="cri-o://630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a" gracePeriod=10 Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.793850 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-h2rns"] Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.793896 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" event={"ID":"b2e0b913-a1f5-4c3f-b445-705ba5348e5d","Type":"ContainerStarted","Data":"630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a"} Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.793916 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" event={"ID":"b2e0b913-a1f5-4c3f-b445-705ba5348e5d","Type":"ContainerStarted","Data":"d84ae1741ce9c952b6d1ea0578307b8e0f8c74e335f445702e23cce00e750abd"} Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.798646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea1a6e80-d761-4986-b5e1-b6f557bb65b2","Type":"ContainerStarted","Data":"9e302106c75ba319383cc5b2201ec23500db489f634cb6223d62a0e7f4ac2c3a"} Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.805094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7854f055-f44d-4abb-89a1-c11c436fc5bd","Type":"ContainerStarted","Data":"dcd2810580f28695f706bb9f701e5f40f7d252904d76b3a7df7ffb730251071a"} Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.807860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58944b8f99-bl9hx" event={"ID":"fb5969c0-4230-4813-9009-546eda8657eb","Type":"ContainerStarted","Data":"c6cc81cc4cdc8097de77ec752a15fe932df3b126402cb0ff3aacaef646ec142a"} Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.812113 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6db48c9776-bw4r7"] Sep 30 
03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.828719 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:29 crc kubenswrapper[4744]: I0930 03:13:29.924915 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.519178 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.691712 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-config\") pod \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.692873 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnlzj\" (UniqueName: \"kubernetes.io/projected/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-kube-api-access-fnlzj\") pod \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.692920 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-nb\") pod \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.693050 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-swift-storage-0\") pod \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.693153 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-svc\") pod \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.693170 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-sb\") pod \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\" (UID: \"b2e0b913-a1f5-4c3f-b445-705ba5348e5d\") " Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.699482 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-kube-api-access-fnlzj" (OuterVolumeSpecName: "kube-api-access-fnlzj") pod "b2e0b913-a1f5-4c3f-b445-705ba5348e5d" (UID: "b2e0b913-a1f5-4c3f-b445-705ba5348e5d"). InnerVolumeSpecName "kube-api-access-fnlzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.718051 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2e0b913-a1f5-4c3f-b445-705ba5348e5d" (UID: "b2e0b913-a1f5-4c3f-b445-705ba5348e5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.718886 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b2e0b913-a1f5-4c3f-b445-705ba5348e5d" (UID: "b2e0b913-a1f5-4c3f-b445-705ba5348e5d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.726035 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b2e0b913-a1f5-4c3f-b445-705ba5348e5d" (UID: "b2e0b913-a1f5-4c3f-b445-705ba5348e5d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.727858 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-config" (OuterVolumeSpecName: "config") pod "b2e0b913-a1f5-4c3f-b445-705ba5348e5d" (UID: "b2e0b913-a1f5-4c3f-b445-705ba5348e5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.735612 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b2e0b913-a1f5-4c3f-b445-705ba5348e5d" (UID: "b2e0b913-a1f5-4c3f-b445-705ba5348e5d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.795491 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.795519 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.795531 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnlzj\" (UniqueName: \"kubernetes.io/projected/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-kube-api-access-fnlzj\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.795541 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.795549 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.795558 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2e0b913-a1f5-4c3f-b445-705ba5348e5d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.837521 4744 generic.go:334] "Generic (PLEG): container finished" podID="b2e0b913-a1f5-4c3f-b445-705ba5348e5d" containerID="630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a" exitCode=0 Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.837581 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" event={"ID":"b2e0b913-a1f5-4c3f-b445-705ba5348e5d","Type":"ContainerDied","Data":"630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.837606 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" event={"ID":"b2e0b913-a1f5-4c3f-b445-705ba5348e5d","Type":"ContainerDied","Data":"d84ae1741ce9c952b6d1ea0578307b8e0f8c74e335f445702e23cce00e750abd"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.837621 4744 scope.go:117] "RemoveContainer" containerID="630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.837724 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-fhw9j" Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.837804 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.840181 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db48c9776-bw4r7" event={"ID":"4af1755b-5573-4aee-aced-42d4b10bcebc","Type":"ContainerStarted","Data":"9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.840216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db48c9776-bw4r7" event={"ID":"4af1755b-5573-4aee-aced-42d4b10bcebc","Type":"ContainerStarted","Data":"b2bf359e1a7aaf1b529ce2a4727738cf98f20ff5f5a45fc85326c64a07f4b5f6"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.843226 4744 generic.go:334] "Generic (PLEG): container finished" podID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerID="7b972a0943fad6eddc7d23af5bcaac6e173a5a1f4962be37ca805011abeb6f40" exitCode=0 Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 
03:13:30.843272 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerDied","Data":"7b972a0943fad6eddc7d23af5bcaac6e173a5a1f4962be37ca805011abeb6f40"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.844878 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5eb2ed9-b6a4-46a2-80aa-934a62621e03","Type":"ContainerStarted","Data":"0ef9df2c9cb3236927f61612d8d8ed4c3b065193135812ceba8e482ae4852694"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.845919 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"0d5e764c-92e1-48ab-a324-a8bac2865222","Type":"ContainerStarted","Data":"96de829a73993b5aa5d4c3406c4d263a305c67d25f29300538feb2463aa70f15"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.847884 4744 generic.go:334] "Generic (PLEG): container finished" podID="31343c55-58c4-4c6a-854c-36bc13beb817" containerID="9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f" exitCode=0 Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.847932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" event={"ID":"31343c55-58c4-4c6a-854c-36bc13beb817","Type":"ContainerDied","Data":"9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.847958 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" event={"ID":"31343c55-58c4-4c6a-854c-36bc13beb817","Type":"ContainerStarted","Data":"0ce05300e7985579f4b47cbb7eb1012b1a845783b6824b15c88a7504b6bf6da9"} Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.936004 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-fhw9j"] Sep 30 03:13:30 crc kubenswrapper[4744]: I0930 03:13:30.948975 4744 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-fhw9j"] Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.468719 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.516308 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e0b913-a1f5-4c3f-b445-705ba5348e5d" path="/var/lib/kubelet/pods/b2e0b913-a1f5-4c3f-b445-705ba5348e5d/volumes" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.609497 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-scripts\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.609605 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-combined-ca-bundle\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.609854 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj846\" (UniqueName: \"kubernetes.io/projected/c56c5a65-d4fe-4772-ba30-eae95674c422-kube-api-access-kj846\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.610159 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-run-httpd\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.610291 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-config-data\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.610515 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-sg-core-conf-yaml\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.610603 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-log-httpd\") pod \"c56c5a65-d4fe-4772-ba30-eae95674c422\" (UID: \"c56c5a65-d4fe-4772-ba30-eae95674c422\") " Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.611111 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.611984 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.612516 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.612544 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c56c5a65-d4fe-4772-ba30-eae95674c422-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.617550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56c5a65-d4fe-4772-ba30-eae95674c422-kube-api-access-kj846" (OuterVolumeSpecName: "kube-api-access-kj846") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "kube-api-access-kj846". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.618811 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-scripts" (OuterVolumeSpecName: "scripts") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.647528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.692500 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.711641 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-config-data" (OuterVolumeSpecName: "config-data") pod "c56c5a65-d4fe-4772-ba30-eae95674c422" (UID: "c56c5a65-d4fe-4772-ba30-eae95674c422"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.713627 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.713652 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.713661 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.713670 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj846\" (UniqueName: \"kubernetes.io/projected/c56c5a65-d4fe-4772-ba30-eae95674c422-kube-api-access-kj846\") on node \"crc\" DevicePath 
\"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.713682 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56c5a65-d4fe-4772-ba30-eae95674c422-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.854681 4744 scope.go:117] "RemoveContainer" containerID="630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a" Sep 30 03:13:31 crc kubenswrapper[4744]: E0930 03:13:31.855259 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a\": container with ID starting with 630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a not found: ID does not exist" containerID="630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.855296 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a"} err="failed to get container status \"630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a\": rpc error: code = NotFound desc = could not find container \"630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a\": container with ID starting with 630f04882423872bfb9cec21a607f719edc48d7dfc8935efa92fd5ae25be6b2a not found: ID does not exist" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.859325 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c56c5a65-d4fe-4772-ba30-eae95674c422","Type":"ContainerDied","Data":"8306c8a75be436fa60369da62fb6f3bf76195618f96a45456fd6e25e5a834ac4"} Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.859387 4744 scope.go:117] "RemoveContainer" containerID="5b580ac53df6bbd1ee16e2ffe410709ae7ec969648d5fef8989283a1c83d0ecf" Sep 30 03:13:31 crc 
kubenswrapper[4744]: I0930 03:13:31.859480 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.873846 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5eb2ed9-b6a4-46a2-80aa-934a62621e03","Type":"ContainerStarted","Data":"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c"} Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.877014 4744 generic.go:334] "Generic (PLEG): container finished" podID="a24c42a2-4afa-4c32-ba87-18251fd1345a" containerID="869acb7ec54dcc781e8364e30d14cdf03deb15f640b00501fbab0d05595a4f44" exitCode=0 Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.877087 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pwqjw" event={"ID":"a24c42a2-4afa-4c32-ba87-18251fd1345a","Type":"ContainerDied","Data":"869acb7ec54dcc781e8364e30d14cdf03deb15f640b00501fbab0d05595a4f44"} Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.879608 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db48c9776-bw4r7" event={"ID":"4af1755b-5573-4aee-aced-42d4b10bcebc","Type":"ContainerStarted","Data":"6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1"} Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.879788 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.879811 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:31 crc kubenswrapper[4744]: I0930 03:13:31.925813 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6db48c9776-bw4r7" podStartSLOduration=3.925793821 podStartE2EDuration="3.925793821s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:31.911507898 +0000 UTC m=+1139.084727872" watchObservedRunningTime="2025-09-30 03:13:31.925793821 +0000 UTC m=+1139.099013785" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.112916 4744 scope.go:117] "RemoveContainer" containerID="0b7f558a23c54f3be65fae1c6af58440967097dbff23cb59f45d386b7497b879" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.173264 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.180022 4744 scope.go:117] "RemoveContainer" containerID="7b972a0943fad6eddc7d23af5bcaac6e173a5a1f4962be37ca805011abeb6f40" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.185217 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.198335 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:13:32 crc kubenswrapper[4744]: E0930 03:13:32.199008 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="proxy-httpd" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199026 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="proxy-httpd" Sep 30 03:13:32 crc kubenswrapper[4744]: E0930 03:13:32.199050 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="sg-core" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199056 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="sg-core" Sep 30 03:13:32 crc kubenswrapper[4744]: E0930 03:13:32.199068 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2e0b913-a1f5-4c3f-b445-705ba5348e5d" containerName="init" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199074 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e0b913-a1f5-4c3f-b445-705ba5348e5d" containerName="init" Sep 30 03:13:32 crc kubenswrapper[4744]: E0930 03:13:32.199091 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="ceilometer-notification-agent" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199097 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="ceilometer-notification-agent" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199540 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e0b913-a1f5-4c3f-b445-705ba5348e5d" containerName="init" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199712 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="sg-core" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199729 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="proxy-httpd" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.199747 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" containerName="ceilometer-notification-agent" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.201428 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.207077 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.208110 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.209543 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337134 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-scripts\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337329 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337361 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-config-data\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337476 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " 
pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337579 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-log-httpd\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337663 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-run-httpd\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.337702 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qc4n\" (UniqueName: \"kubernetes.io/projected/08c01111-188d-4d73-be12-aac3feab4b02-kube-api-access-8qc4n\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440030 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-scripts\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440239 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440273 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-config-data\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440300 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440338 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-log-httpd\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440367 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-run-httpd\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440416 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qc4n\" (UniqueName: \"kubernetes.io/projected/08c01111-188d-4d73-be12-aac3feab4b02-kube-api-access-8qc4n\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.440751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-log-httpd\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc 
kubenswrapper[4744]: I0930 03:13:32.440939 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-run-httpd\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.447952 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-scripts\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.452551 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.455094 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.455783 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qc4n\" (UniqueName: \"kubernetes.io/projected/08c01111-188d-4d73-be12-aac3feab4b02-kube-api-access-8qc4n\") pod \"ceilometer-0\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.456473 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-config-data\") pod \"ceilometer-0\" (UID: 
\"08c01111-188d-4d73-be12-aac3feab4b02\") " pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.534582 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.891855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" event={"ID":"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e","Type":"ContainerStarted","Data":"79b61c4ebc36b06cae86f82cf53d7b60af747503e8f590a58a462edf3067e260"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.892307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" event={"ID":"89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e","Type":"ContainerStarted","Data":"fe67224882e016bbe4dd25dfc1fa1d80b3393397bbde546de7675586d43c7af3"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.893884 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea1a6e80-d761-4986-b5e1-b6f557bb65b2","Type":"ContainerStarted","Data":"418df48d044b0a8cc494017001bf3ece991bc433a8a912333d4b59ea1cd5fabb"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.897521 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7854f055-f44d-4abb-89a1-c11c436fc5bd","Type":"ContainerStarted","Data":"31b125f3ecaef0be11faf3853af66f1e39df34490f86fb0a01bbdfc186420ed7"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.897561 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7854f055-f44d-4abb-89a1-c11c436fc5bd","Type":"ContainerStarted","Data":"c7b39fca39873d8c9349f9d620bc66e7e66a2c4bba8b884794ecd81a50f63d12"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.913537 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-578ccf57db-dnd4k" podStartSLOduration=2.261483504 podStartE2EDuration="4.913520186s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="2025-09-30 03:13:29.246932717 +0000 UTC m=+1136.420152691" lastFinishedPulling="2025-09-30 03:13:31.898969399 +0000 UTC m=+1139.072189373" observedRunningTime="2025-09-30 03:13:32.911074361 +0000 UTC m=+1140.084294345" watchObservedRunningTime="2025-09-30 03:13:32.913520186 +0000 UTC m=+1140.086740160" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.930911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58944b8f99-bl9hx" event={"ID":"fb5969c0-4230-4813-9009-546eda8657eb","Type":"ContainerStarted","Data":"589851f8bbbcf66e8e0fb0851fe33946e07dfe8caa75a2167577901d217e9dda"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.930959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58944b8f99-bl9hx" event={"ID":"fb5969c0-4230-4813-9009-546eda8657eb","Type":"ContainerStarted","Data":"75976d47453617de630c552b4d60dfc56019b9fd524a912bacd1620b2b0fa589"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.940803 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.768389562 podStartE2EDuration="4.940788122s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="2025-09-30 03:13:29.723928346 +0000 UTC m=+1136.897148320" lastFinishedPulling="2025-09-30 03:13:31.896326906 +0000 UTC m=+1139.069546880" observedRunningTime="2025-09-30 03:13:32.939525213 +0000 UTC m=+1140.112745197" watchObservedRunningTime="2025-09-30 03:13:32.940788122 +0000 UTC m=+1140.114008096" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.963163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"c5eb2ed9-b6a4-46a2-80aa-934a62621e03","Type":"ContainerStarted","Data":"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213"} Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.963319 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api-log" containerID="cri-o://e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c" gracePeriod=30 Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.963571 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.963831 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api" containerID="cri-o://1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213" gracePeriod=30 Sep 30 03:13:32 crc kubenswrapper[4744]: I0930 03:13:32.967425 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-58944b8f99-bl9hx" podStartSLOduration=2.479026484 podStartE2EDuration="4.967406848s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="2025-09-30 03:13:29.425765985 +0000 UTC m=+1136.598985959" lastFinishedPulling="2025-09-30 03:13:31.914146349 +0000 UTC m=+1139.087366323" observedRunningTime="2025-09-30 03:13:32.961254237 +0000 UTC m=+1140.134474211" watchObservedRunningTime="2025-09-30 03:13:32.967406848 +0000 UTC m=+1140.140626822" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.031320 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.031303771 podStartE2EDuration="5.031303771s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:33.000258487 +0000 UTC m=+1140.173478471" watchObservedRunningTime="2025-09-30 03:13:33.031303771 +0000 UTC m=+1140.204523745" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.037865 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.038038 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"0d5e764c-92e1-48ab-a324-a8bac2865222","Type":"ContainerStarted","Data":"c58f92b97d3150ec8bd56d6c1c74d3e1c513a32fb545a802ad231a0ccc810148"} Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.038079 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"0d5e764c-92e1-48ab-a324-a8bac2865222","Type":"ContainerStarted","Data":"94a7e815f90b3be48a274556c8c29cae05a2c068205fc1c34290a5f4f91301a1"} Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.046626 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" event={"ID":"31343c55-58c4-4c6a-854c-36bc13beb817","Type":"ContainerStarted","Data":"ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955"} Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.047330 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.071217 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.060482244 podStartE2EDuration="5.071201198s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="2025-09-30 03:13:29.888240924 +0000 UTC m=+1137.061460898" lastFinishedPulling="2025-09-30 03:13:31.898959878 +0000 UTC m=+1139.072179852" observedRunningTime="2025-09-30 03:13:33.063809559 +0000 UTC m=+1140.237029533" 
watchObservedRunningTime="2025-09-30 03:13:33.071201198 +0000 UTC m=+1140.244421172" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.082750 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" podStartSLOduration=5.082730566 podStartE2EDuration="5.082730566s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:33.080701113 +0000 UTC m=+1140.253921087" watchObservedRunningTime="2025-09-30 03:13:33.082730566 +0000 UTC m=+1140.255950540" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.522083 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56c5a65-d4fe-4772-ba30-eae95674c422" path="/var/lib/kubelet/pods/c56c5a65-d4fe-4772-ba30-eae95674c422/volumes" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.609705 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-pwqjw" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.782243 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-config-data\") pod \"a24c42a2-4afa-4c32-ba87-18251fd1345a\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.782299 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-combined-ca-bundle\") pod \"a24c42a2-4afa-4c32-ba87-18251fd1345a\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.782385 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4skk\" (UniqueName: \"kubernetes.io/projected/a24c42a2-4afa-4c32-ba87-18251fd1345a-kube-api-access-q4skk\") pod \"a24c42a2-4afa-4c32-ba87-18251fd1345a\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.782423 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-job-config-data\") pod \"a24c42a2-4afa-4c32-ba87-18251fd1345a\" (UID: \"a24c42a2-4afa-4c32-ba87-18251fd1345a\") " Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.795598 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "a24c42a2-4afa-4c32-ba87-18251fd1345a" (UID: "a24c42a2-4afa-4c32-ba87-18251fd1345a"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.824642 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24c42a2-4afa-4c32-ba87-18251fd1345a-kube-api-access-q4skk" (OuterVolumeSpecName: "kube-api-access-q4skk") pod "a24c42a2-4afa-4c32-ba87-18251fd1345a" (UID: "a24c42a2-4afa-4c32-ba87-18251fd1345a"). InnerVolumeSpecName "kube-api-access-q4skk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.841498 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-config-data" (OuterVolumeSpecName: "config-data") pod "a24c42a2-4afa-4c32-ba87-18251fd1345a" (UID: "a24c42a2-4afa-4c32-ba87-18251fd1345a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.871565 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a24c42a2-4afa-4c32-ba87-18251fd1345a" (UID: "a24c42a2-4afa-4c32-ba87-18251fd1345a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.880512 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.889363 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.889960 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4skk\" (UniqueName: \"kubernetes.io/projected/a24c42a2-4afa-4c32-ba87-18251fd1345a-kube-api-access-q4skk\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.889998 4744 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-job-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.890008 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.890018 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a24c42a2-4afa-4c32-ba87-18251fd1345a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:33 crc kubenswrapper[4744]: I0930 03:13:33.967644 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095059 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-combined-ca-bundle\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-etc-machine-id\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095164 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095192 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-logs\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095271 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-scripts\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095304 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data-custom\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.095414 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7klp\" (UniqueName: \"kubernetes.io/projected/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-kube-api-access-p7klp\") pod \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\" (UID: \"c5eb2ed9-b6a4-46a2-80aa-934a62621e03\") " Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.097906 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-logs" (OuterVolumeSpecName: "logs") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.097946 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.098329 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.098387 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.118795 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-scripts" (OuterVolumeSpecName: "scripts") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.119157 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-kube-api-access-p7klp" (OuterVolumeSpecName: "kube-api-access-p7klp") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "kube-api-access-p7klp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.120471 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.128058 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-pwqjw" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.130624 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-pwqjw" event={"ID":"a24c42a2-4afa-4c32-ba87-18251fd1345a","Type":"ContainerDied","Data":"1518c5f2cdb2a35b74e22c57f80274f509823a96a590622875d658645b7aeed9"} Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.130667 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1518c5f2cdb2a35b74e22c57f80274f509823a96a590622875d658645b7aeed9" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.142130 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerStarted","Data":"b836fb9ea398818bd93a709539d863d50a982dad436139d298b1b3d65661d7b6"} Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.158962 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerID="1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213" exitCode=0 Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.158992 4744 generic.go:334] "Generic (PLEG): container finished" podID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerID="e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c" exitCode=143 Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.159027 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5eb2ed9-b6a4-46a2-80aa-934a62621e03","Type":"ContainerDied","Data":"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213"} Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.159051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"c5eb2ed9-b6a4-46a2-80aa-934a62621e03","Type":"ContainerDied","Data":"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c"} Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.159061 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5eb2ed9-b6a4-46a2-80aa-934a62621e03","Type":"ContainerDied","Data":"0ef9df2c9cb3236927f61612d8d8ed4c3b065193135812ceba8e482ae4852694"} Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.159075 4744 scope.go:117] "RemoveContainer" containerID="1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.159180 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.166668 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.174116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea1a6e80-d761-4986-b5e1-b6f557bb65b2","Type":"ContainerStarted","Data":"b3be161428522859d0f371e5cf866cfc5cd68d85d26265fdd82f1883f371573c"} Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.204436 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.205092 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.205106 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.205119 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7klp\" (UniqueName: \"kubernetes.io/projected/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-kube-api-access-p7klp\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.239277 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.156630341 podStartE2EDuration="6.239259329s" podCreationTimestamp="2025-09-30 03:13:28 +0000 UTC" firstStartedPulling="2025-09-30 03:13:29.513664203 +0000 UTC m=+1136.686884177" lastFinishedPulling="2025-09-30 03:13:30.596293191 +0000 UTC m=+1137.769513165" observedRunningTime="2025-09-30 03:13:34.222334593 +0000 UTC m=+1141.395554567" 
watchObservedRunningTime="2025-09-30 03:13:34.239259329 +0000 UTC m=+1141.412479313" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.355299 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.355354 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.355485 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.355875 4744 scope.go:117] "RemoveContainer" containerID="e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.356198 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d72e9221a902ba71a0038b939d0d12d57f148cf38a3a98c9981e273e6748a54"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.356250 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" 
containerID="cri-o://1d72e9221a902ba71a0038b939d0d12d57f148cf38a3a98c9981e273e6748a54" gracePeriod=600 Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.387290 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: E0930 03:13:34.389094 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24c42a2-4afa-4c32-ba87-18251fd1345a" containerName="manila-db-sync" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.389114 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24c42a2-4afa-4c32-ba87-18251fd1345a" containerName="manila-db-sync" Sep 30 03:13:34 crc kubenswrapper[4744]: E0930 03:13:34.389130 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.389139 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api" Sep 30 03:13:34 crc kubenswrapper[4744]: E0930 03:13:34.389172 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api-log" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.389181 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api-log" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.389435 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.389465 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24c42a2-4afa-4c32-ba87-18251fd1345a" containerName="manila-db-sync" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.389496 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" containerName="cinder-api-log" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.396034 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.406173 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.406359 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.406419 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-ljrm7" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.406552 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.410818 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.412184 4744 scope.go:117] "RemoveContainer" containerID="1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.413191 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.416528 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Sep 30 03:13:34 crc kubenswrapper[4744]: E0930 03:13:34.417376 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213\": container with ID starting with 1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213 not found: ID does not exist" containerID="1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.417432 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213"} err="failed to get container status \"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213\": rpc error: code = NotFound desc = could not find container \"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213\": container with ID starting with 1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213 not found: ID does not exist" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.417454 4744 scope.go:117] "RemoveContainer" containerID="e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c" Sep 30 03:13:34 crc kubenswrapper[4744]: E0930 03:13:34.420614 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c\": container with ID starting with e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c not found: ID does not exist" containerID="e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 
03:13:34.420652 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c"} err="failed to get container status \"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c\": rpc error: code = NotFound desc = could not find container \"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c\": container with ID starting with e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c not found: ID does not exist" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.420684 4744 scope.go:117] "RemoveContainer" containerID="1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.423314 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213"} err="failed to get container status \"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213\": rpc error: code = NotFound desc = could not find container \"1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213\": container with ID starting with 1fe78b344e75ca9ef2ad891d0f5d85d1ebc03826aa56cfa6a189216d5fd4c213 not found: ID does not exist" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.423344 4744 scope.go:117] "RemoveContainer" containerID="e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.433541 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.436709 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c"} err="failed to get container status \"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c\": rpc error: 
code = NotFound desc = could not find container \"e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c\": container with ID starting with e25117400a9fcb8a7a2bd6f7ff499e46f9760c64efa6bcedd5fc47b3e5958d8c not found: ID does not exist" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.443670 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.470351 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data" (OuterVolumeSpecName: "config-data") pod "c5eb2ed9-b6a4-46a2-80aa-934a62621e03" (UID: "c5eb2ed9-b6a4-46a2-80aa-934a62621e03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.482187 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-h2rns"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.499844 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-s4m4j"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.501381 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.512871 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-s4m4j"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529632 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrw2s\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-kube-api-access-rrw2s\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529702 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hqgz\" (UniqueName: \"kubernetes.io/projected/8410c494-56fd-4498-ad12-e0c6dad119bf-kube-api-access-8hqgz\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529725 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-config\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529745 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529759 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529809 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-svc\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529841 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs2ns\" (UniqueName: \"kubernetes.io/projected/409d892b-c223-43a8-9550-ff50fb759e2b-kube-api-access-fs2ns\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529877 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529937 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529971 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.529991 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530010 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530063 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-scripts\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530112 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-scripts\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530140 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530190 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-ceph\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530209 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530263 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8410c494-56fd-4498-ad12-e0c6dad119bf-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.530736 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5eb2ed9-b6a4-46a2-80aa-934a62621e03-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.571310 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.573017 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.576208 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.587508 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.631915 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8410c494-56fd-4498-ad12-e0c6dad119bf-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.631969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrw2s\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-kube-api-access-rrw2s\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632001 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8hqgz\" (UniqueName: \"kubernetes.io/projected/8410c494-56fd-4498-ad12-e0c6dad119bf-kube-api-access-8hqgz\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632019 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-config\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632036 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632050 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632084 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-svc\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632106 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs2ns\" (UniqueName: 
\"kubernetes.io/projected/409d892b-c223-43a8-9550-ff50fb759e2b-kube-api-access-fs2ns\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632127 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632157 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632175 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632190 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632207 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632254 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-scripts\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632272 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632289 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-scripts\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632308 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc 
kubenswrapper[4744]: I0930 03:13:34.632336 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-ceph\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.632353 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.634464 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.634515 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8410c494-56fd-4498-ad12-e0c6dad119bf-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.635543 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-config\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.636885 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-svc\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.636901 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.638662 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.643292 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.643797 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.644344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data-custom\") pod 
\"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.645661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.650500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hqgz\" (UniqueName: \"kubernetes.io/projected/8410c494-56fd-4498-ad12-e0c6dad119bf-kube-api-access-8hqgz\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.651012 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.651587 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-ceph\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.653274 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.654686 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.656731 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-scripts\") pod \"manila-scheduler-0\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.657097 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-scripts\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.659543 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrw2s\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-kube-api-access-rrw2s\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.660951 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs2ns\" (UniqueName: \"kubernetes.io/projected/409d892b-c223-43a8-9550-ff50fb759e2b-kube-api-access-fs2ns\") pod \"dnsmasq-dns-5865f9d689-s4m4j\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.663450 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734010 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data-custom\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/837b73b3-fd73-48a3-888c-874871f8ce4c-etc-machine-id\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734183 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6272\" (UniqueName: \"kubernetes.io/projected/837b73b3-fd73-48a3-888c-874871f8ce4c-kube-api-access-w6272\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734217 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-scripts\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data\") pod \"manila-api-0\" (UID: 
\"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734390 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.734426 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837b73b3-fd73-48a3-888c-874871f8ce4c-logs\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.769035 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.769592 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.813408 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.830663 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.837356 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.837427 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.846300 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.847821 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.849938 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.853212 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.853348 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.853418 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.853576 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.856551 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.856933 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.857046 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837b73b3-fd73-48a3-888c-874871f8ce4c-logs\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.857311 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837b73b3-fd73-48a3-888c-874871f8ce4c-logs\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.857372 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data-custom\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.857435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/837b73b3-fd73-48a3-888c-874871f8ce4c-etc-machine-id\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.857507 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6272\" (UniqueName: 
\"kubernetes.io/projected/837b73b3-fd73-48a3-888c-874871f8ce4c-kube-api-access-w6272\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.857545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-scripts\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.859467 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/837b73b3-fd73-48a3-888c-874871f8ce4c-etc-machine-id\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.860278 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-scripts\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.869121 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data-custom\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.885957 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6272\" (UniqueName: \"kubernetes.io/projected/837b73b3-fd73-48a3-888c-874871f8ce4c-kube-api-access-w6272\") pod \"manila-api-0\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.912709 4744 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.959804 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5269829d-b1f7-4980-9550-d622fa40c1f1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.959857 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5269829d-b1f7-4980-9550-d622fa40c1f1-logs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.959907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.959936 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.959954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-config-data-custom\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc 
kubenswrapper[4744]: I0930 03:13:34.960013 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkzrc\" (UniqueName: \"kubernetes.io/projected/5269829d-b1f7-4980-9550-d622fa40c1f1-kube-api-access-zkzrc\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.960044 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-scripts\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.960074 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-config-data\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:34 crc kubenswrapper[4744]: I0930 03:13:34.960091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.068602 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkzrc\" (UniqueName: \"kubernetes.io/projected/5269829d-b1f7-4980-9550-d622fa40c1f1-kube-api-access-zkzrc\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.069666 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-scripts\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.070649 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-config-data\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.071328 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.071435 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5269829d-b1f7-4980-9550-d622fa40c1f1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.071938 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5269829d-b1f7-4980-9550-d622fa40c1f1-logs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.072144 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: 
I0930 03:13:35.072242 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.072314 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-config-data-custom\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.074569 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5269829d-b1f7-4980-9550-d622fa40c1f1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.076116 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5269829d-b1f7-4980-9550-d622fa40c1f1-logs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.076780 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-scripts\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.078825 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-config-data\") pod \"cinder-api-0\" (UID: 
\"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.079939 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.082705 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-config-data-custom\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.089448 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkzrc\" (UniqueName: \"kubernetes.io/projected/5269829d-b1f7-4980-9550-d622fa40c1f1-kube-api-access-zkzrc\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.114181 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.114455 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5269829d-b1f7-4980-9550-d622fa40c1f1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5269829d-b1f7-4980-9550-d622fa40c1f1\") " pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.238327 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.262342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerStarted","Data":"2698a953a95589cf61a782bfc00177599f663e2ce99b6ac1637c69174697adca"} Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.325244 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="1d72e9221a902ba71a0038b939d0d12d57f148cf38a3a98c9981e273e6748a54" exitCode=0 Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.326999 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"1d72e9221a902ba71a0038b939d0d12d57f148cf38a3a98c9981e273e6748a54"} Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.327024 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"b7975d758249d48351e6b790ced251dcf0b3dce30fe61d2854cf2d73cc541951"} Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.327039 4744 scope.go:117] "RemoveContainer" containerID="8a5c6bc379bf988ae0369b42f93fd361d89694e20343a5b27933e4ef1594e651" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.528567 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5eb2ed9-b6a4-46a2-80aa-934a62621e03" path="/var/lib/kubelet/pods/c5eb2ed9-b6a4-46a2-80aa-934a62621e03/volumes" Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.600908 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-s4m4j"] Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.710610 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/manila-share-share1-0"] Sep 30 03:13:35 crc kubenswrapper[4744]: I0930 03:13:35.844402 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.052144 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.083509 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.337244 4744 generic.go:334] "Generic (PLEG): container finished" podID="409d892b-c223-43a8-9550-ff50fb759e2b" containerID="0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4" exitCode=0 Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.337520 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" event={"ID":"409d892b-c223-43a8-9550-ff50fb759e2b","Type":"ContainerDied","Data":"0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4"} Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.337545 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" event={"ID":"409d892b-c223-43a8-9550-ff50fb759e2b","Type":"ContainerStarted","Data":"d04f89a472c0351f699d86f9936d5cfd55a1c2eb1af7636ce135b1d5db42ac2b"} Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.370726 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"837b73b3-fd73-48a3-888c-874871f8ce4c","Type":"ContainerStarted","Data":"774899f4a62f1ed7307a0fa11625ee9b6d1f9f01b490b96f1640972cdcffde8f"} Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.380225 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1486a12d-9554-48ca-899d-1286e1b5913b","Type":"ContainerStarted","Data":"ca7b83a4f6adefdcf036ab83869b72faff10f33aaf493403988d25b6e9204f82"} Sep 30 03:13:36 
crc kubenswrapper[4744]: I0930 03:13:36.387185 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerStarted","Data":"df817ee6232c687339fd952171095ffd9e3863a87c1ddb3ac1baaac7c0c7012c"} Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.390958 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8410c494-56fd-4498-ad12-e0c6dad119bf","Type":"ContainerStarted","Data":"d35ad5a933e3b5eb02f0e04ed41bded393636a86b0a75154de7d98ba4babea2a"} Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.413538 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" containerName="dnsmasq-dns" containerID="cri-o://ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955" gracePeriod=10 Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.413824 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5269829d-b1f7-4980-9550-d622fa40c1f1","Type":"ContainerStarted","Data":"a5ab5ce3222cab8640aed19dbccbc784a9cdd2ded328c8f7f6616c69c6442443"} Sep 30 03:13:36 crc kubenswrapper[4744]: I0930 03:13:36.540736 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.129481 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.244444 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-sb\") pod \"31343c55-58c4-4c6a-854c-36bc13beb817\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.244502 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-swift-storage-0\") pod \"31343c55-58c4-4c6a-854c-36bc13beb817\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.244606 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-config\") pod \"31343c55-58c4-4c6a-854c-36bc13beb817\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.244664 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfc47\" (UniqueName: \"kubernetes.io/projected/31343c55-58c4-4c6a-854c-36bc13beb817-kube-api-access-hfc47\") pod \"31343c55-58c4-4c6a-854c-36bc13beb817\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.244776 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-nb\") pod \"31343c55-58c4-4c6a-854c-36bc13beb817\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.244797 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-svc\") pod \"31343c55-58c4-4c6a-854c-36bc13beb817\" (UID: \"31343c55-58c4-4c6a-854c-36bc13beb817\") " Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.286466 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31343c55-58c4-4c6a-854c-36bc13beb817-kube-api-access-hfc47" (OuterVolumeSpecName: "kube-api-access-hfc47") pod "31343c55-58c4-4c6a-854c-36bc13beb817" (UID: "31343c55-58c4-4c6a-854c-36bc13beb817"). InnerVolumeSpecName "kube-api-access-hfc47". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.349477 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfc47\" (UniqueName: \"kubernetes.io/projected/31343c55-58c4-4c6a-854c-36bc13beb817-kube-api-access-hfc47\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.385047 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31343c55-58c4-4c6a-854c-36bc13beb817" (UID: "31343c55-58c4-4c6a-854c-36bc13beb817"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.399188 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31343c55-58c4-4c6a-854c-36bc13beb817" (UID: "31343c55-58c4-4c6a-854c-36bc13beb817"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.405944 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "31343c55-58c4-4c6a-854c-36bc13beb817" (UID: "31343c55-58c4-4c6a-854c-36bc13beb817"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.412012 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31343c55-58c4-4c6a-854c-36bc13beb817" (UID: "31343c55-58c4-4c6a-854c-36bc13beb817"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.413862 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-config" (OuterVolumeSpecName: "config") pod "31343c55-58c4-4c6a-854c-36bc13beb817" (UID: "31343c55-58c4-4c6a-854c-36bc13beb817"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.438780 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"837b73b3-fd73-48a3-888c-874871f8ce4c","Type":"ContainerStarted","Data":"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7"} Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.447138 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5269829d-b1f7-4980-9550-d622fa40c1f1","Type":"ContainerStarted","Data":"eaf7b259a9b8f47f33d5b2cfcdd87eb3c7848630245c604dd673f2e50665d533"} Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.451556 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.451575 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.451585 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.451593 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.451601 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31343c55-58c4-4c6a-854c-36bc13beb817-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:37 crc 
kubenswrapper[4744]: I0930 03:13:37.457744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" event={"ID":"409d892b-c223-43a8-9550-ff50fb759e2b","Type":"ContainerStarted","Data":"b51489a87cb59a83deb39652c5d9471a460958b788d0c6b660b75d1ed64a725c"} Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.458595 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.487683 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerStarted","Data":"3cad0288c8883e37f9ca31fae7c2654dffef18ae90c1be8acf5229ea3e1c1f0b"} Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.492493 4744 generic.go:334] "Generic (PLEG): container finished" podID="31343c55-58c4-4c6a-854c-36bc13beb817" containerID="ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955" exitCode=0 Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.492564 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" event={"ID":"31343c55-58c4-4c6a-854c-36bc13beb817","Type":"ContainerDied","Data":"ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955"} Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.492572 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.492602 4744 scope.go:117] "RemoveContainer" containerID="ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.492588 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-h2rns" event={"ID":"31343c55-58c4-4c6a-854c-36bc13beb817","Type":"ContainerDied","Data":"0ce05300e7985579f4b47cbb7eb1012b1a845783b6824b15c88a7504b6bf6da9"} Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.529063 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" podStartSLOduration=3.529038937 podStartE2EDuration="3.529038937s" podCreationTimestamp="2025-09-30 03:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:37.474606168 +0000 UTC m=+1144.647826132" watchObservedRunningTime="2025-09-30 03:13:37.529038937 +0000 UTC m=+1144.702258911" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.534590 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-h2rns"] Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.565804 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-h2rns"] Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.574572 4744 scope.go:117] "RemoveContainer" containerID="9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.624092 4744 scope.go:117] "RemoveContainer" containerID="ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955" Sep 30 03:13:37 crc kubenswrapper[4744]: E0930 03:13:37.624656 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955\": container with ID starting with ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955 not found: ID does not exist" containerID="ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.624706 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955"} err="failed to get container status \"ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955\": rpc error: code = NotFound desc = could not find container \"ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955\": container with ID starting with ae6a19dc3e4aef2c95b7f8bbcc9ff4523d716262094404d7af16c8f2354ce955 not found: ID does not exist" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.624735 4744 scope.go:117] "RemoveContainer" containerID="9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f" Sep 30 03:13:37 crc kubenswrapper[4744]: E0930 03:13:37.624969 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f\": container with ID starting with 9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f not found: ID does not exist" containerID="9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f" Sep 30 03:13:37 crc kubenswrapper[4744]: I0930 03:13:37.624995 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f"} err="failed to get container status \"9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f\": rpc error: code = NotFound desc = could not find container \"9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f\": container 
with ID starting with 9727f7a5b56362511d05342653846bfbe62a188ffd1d04c20039b6156a05178f not found: ID does not exist" Sep 30 03:13:38 crc kubenswrapper[4744]: W0930 03:13:38.043685 4744 watcher.go:93] Error while processing event ("/sys/fs/cgroup/user.slice/user-0.slice/session-c6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/user.slice/user-0.slice/session-c6.scope: no such file or directory Sep 30 03:13:38 crc kubenswrapper[4744]: W0930 03:13:38.059321 4744 watcher.go:93] Error while processing event ("/sys/fs/cgroup/user.slice/user-0.slice/session-c7.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/user.slice/user-0.slice/session-c7.scope: no such file or directory Sep 30 03:13:38 crc kubenswrapper[4744]: W0930 03:13:38.059355 4744 watcher.go:93] Error while processing event ("/sys/fs/cgroup/user.slice/user-0.slice/session-c8.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/user.slice/user-0.slice/session-c8.scope: no such file or directory Sep 30 03:13:38 crc kubenswrapper[4744]: W0930 03:13:38.060044 4744 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod409d892b_c223_43a8_9550_ff50fb759e2b.slice/crio-conmon-0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod409d892b_c223_43a8_9550_ff50fb759e2b.slice/crio-conmon-0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4.scope: no such file or directory Sep 30 03:13:38 crc kubenswrapper[4744]: W0930 03:13:38.060146 4744 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod409d892b_c223_43a8_9550_ff50fb759e2b.slice/crio-0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): 
inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod409d892b_c223_43a8_9550_ff50fb759e2b.slice/crio-0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4.scope: no such file or directory Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.482800 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.532624 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8410c494-56fd-4498-ad12-e0c6dad119bf","Type":"ContainerStarted","Data":"805d11c22078c31436883518ab8445d3e95bab0a2c52d61452efb4cae229919d"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.532664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8410c494-56fd-4498-ad12-e0c6dad119bf","Type":"ContainerStarted","Data":"043bd75c3b4314935500113eb155dc3a84be9e5f995dca57714a7262128780e2"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.539767 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5269829d-b1f7-4980-9550-d622fa40c1f1","Type":"ContainerStarted","Data":"2da1e6147331c0dffdb106ffb76a9adfef786294c8a3b7f56881895840d039dd"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.540538 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.544759 4744 generic.go:334] "Generic (PLEG): container finished" podID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerID="eac00480fcc09aea56d5c80100cc25aebb5eafdb2a9df36c2149d681a3e4665a" exitCode=137 Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.544781 4744 generic.go:334] "Generic (PLEG): container finished" podID="aa042742-a24d-4cf6-aecf-20b41b3287b4" 
containerID="a10058c3b49add8e4fe84a75fecff0f0ea29107b6cd728fe3c1119456e886646" exitCode=137 Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.544810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796557ff95-kphjm" event={"ID":"aa042742-a24d-4cf6-aecf-20b41b3287b4","Type":"ContainerDied","Data":"eac00480fcc09aea56d5c80100cc25aebb5eafdb2a9df36c2149d681a3e4665a"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.544826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796557ff95-kphjm" event={"ID":"aa042742-a24d-4cf6-aecf-20b41b3287b4","Type":"ContainerDied","Data":"a10058c3b49add8e4fe84a75fecff0f0ea29107b6cd728fe3c1119456e886646"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.565522 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"837b73b3-fd73-48a3-888c-874871f8ce4c","Type":"ContainerStarted","Data":"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.566243 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568225 4744 generic.go:334] "Generic (PLEG): container finished" podID="66d32c30-b69c-4637-97a9-c1112b954a92" containerID="16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291" exitCode=137 Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568248 4744 generic.go:334] "Generic (PLEG): container finished" podID="66d32c30-b69c-4637-97a9-c1112b954a92" containerID="7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a" exitCode=137 Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758dd45f85-fgxnt" event={"ID":"66d32c30-b69c-4637-97a9-c1112b954a92","Type":"ContainerDied","Data":"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291"} Sep 30 
03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568292 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758dd45f85-fgxnt" event={"ID":"66d32c30-b69c-4637-97a9-c1112b954a92","Type":"ContainerDied","Data":"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568300 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758dd45f85-fgxnt" event={"ID":"66d32c30-b69c-4637-97a9-c1112b954a92","Type":"ContainerDied","Data":"299c19b6853a008074c6f722837c470a495d6d2854505b1e154d1de45e813530"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568315 4744 scope.go:117] "RemoveContainer" containerID="16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.568384 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758dd45f85-fgxnt" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.572109 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.572100009 podStartE2EDuration="4.572100009s" podCreationTimestamp="2025-09-30 03:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:38.568806316 +0000 UTC m=+1145.742026290" watchObservedRunningTime="2025-09-30 03:13:38.572100009 +0000 UTC m=+1145.745319983" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.572952 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.66258151 podStartE2EDuration="4.572948815s" podCreationTimestamp="2025-09-30 03:13:34 +0000 UTC" firstStartedPulling="2025-09-30 03:13:35.883136091 +0000 UTC m=+1143.056356065" lastFinishedPulling="2025-09-30 03:13:36.793503396 +0000 UTC m=+1143.966723370" 
observedRunningTime="2025-09-30 03:13:38.555674999 +0000 UTC m=+1145.728894973" watchObservedRunningTime="2025-09-30 03:13:38.572948815 +0000 UTC m=+1145.746168789" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.583641 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerStarted","Data":"dbc389ec3e3bf40949ca935842c9ff6ed70c16d1ee56e9a1edc4ae9a27014b5d"} Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.583674 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.591245 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d32c30-b69c-4637-97a9-c1112b954a92-logs\") pod \"66d32c30-b69c-4637-97a9-c1112b954a92\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.591287 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-config-data\") pod \"66d32c30-b69c-4637-97a9-c1112b954a92\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.591435 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-scripts\") pod \"66d32c30-b69c-4637-97a9-c1112b954a92\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.591522 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66d32c30-b69c-4637-97a9-c1112b954a92-horizon-secret-key\") pod \"66d32c30-b69c-4637-97a9-c1112b954a92\" (UID: 
\"66d32c30-b69c-4637-97a9-c1112b954a92\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.591562 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59wrc\" (UniqueName: \"kubernetes.io/projected/66d32c30-b69c-4637-97a9-c1112b954a92-kube-api-access-59wrc\") pod \"66d32c30-b69c-4637-97a9-c1112b954a92\" (UID: \"66d32c30-b69c-4637-97a9-c1112b954a92\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.594787 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.594773362 podStartE2EDuration="4.594773362s" podCreationTimestamp="2025-09-30 03:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:38.593809752 +0000 UTC m=+1145.767029726" watchObservedRunningTime="2025-09-30 03:13:38.594773362 +0000 UTC m=+1145.767993336" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.599207 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d32c30-b69c-4637-97a9-c1112b954a92-kube-api-access-59wrc" (OuterVolumeSpecName: "kube-api-access-59wrc") pod "66d32c30-b69c-4637-97a9-c1112b954a92" (UID: "66d32c30-b69c-4637-97a9-c1112b954a92"). InnerVolumeSpecName "kube-api-access-59wrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.601842 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d32c30-b69c-4637-97a9-c1112b954a92-logs" (OuterVolumeSpecName: "logs") pod "66d32c30-b69c-4637-97a9-c1112b954a92" (UID: "66d32c30-b69c-4637-97a9-c1112b954a92"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.608063 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d32c30-b69c-4637-97a9-c1112b954a92-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "66d32c30-b69c-4637-97a9-c1112b954a92" (UID: "66d32c30-b69c-4637-97a9-c1112b954a92"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.618993 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.636381 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.874184422 podStartE2EDuration="6.636350032s" podCreationTimestamp="2025-09-30 03:13:32 +0000 UTC" firstStartedPulling="2025-09-30 03:13:33.099904539 +0000 UTC m=+1140.273124513" lastFinishedPulling="2025-09-30 03:13:37.862070149 +0000 UTC m=+1145.035290123" observedRunningTime="2025-09-30 03:13:38.619353635 +0000 UTC m=+1145.792573609" watchObservedRunningTime="2025-09-30 03:13:38.636350032 +0000 UTC m=+1145.809570006" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.647468 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-scripts" (OuterVolumeSpecName: "scripts") pod "66d32c30-b69c-4637-97a9-c1112b954a92" (UID: "66d32c30-b69c-4637-97a9-c1112b954a92"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.662990 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-config-data" (OuterVolumeSpecName: "config-data") pod "66d32c30-b69c-4637-97a9-c1112b954a92" (UID: "66d32c30-b69c-4637-97a9-c1112b954a92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.693392 4744 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66d32c30-b69c-4637-97a9-c1112b954a92-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.693426 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59wrc\" (UniqueName: \"kubernetes.io/projected/66d32c30-b69c-4637-97a9-c1112b954a92-kube-api-access-59wrc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.693438 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66d32c30-b69c-4637-97a9-c1112b954a92-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.693447 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.693455 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d32c30-b69c-4637-97a9-c1112b954a92-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.775306 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 03:13:38 
crc kubenswrapper[4744]: I0930 03:13:38.794443 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa042742-a24d-4cf6-aecf-20b41b3287b4-logs\") pod \"aa042742-a24d-4cf6-aecf-20b41b3287b4\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.794512 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmvg\" (UniqueName: \"kubernetes.io/projected/aa042742-a24d-4cf6-aecf-20b41b3287b4-kube-api-access-6rmvg\") pod \"aa042742-a24d-4cf6-aecf-20b41b3287b4\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.794593 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-scripts\") pod \"aa042742-a24d-4cf6-aecf-20b41b3287b4\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.794661 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa042742-a24d-4cf6-aecf-20b41b3287b4-horizon-secret-key\") pod \"aa042742-a24d-4cf6-aecf-20b41b3287b4\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.794706 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-config-data\") pod \"aa042742-a24d-4cf6-aecf-20b41b3287b4\" (UID: \"aa042742-a24d-4cf6-aecf-20b41b3287b4\") " Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.795300 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa042742-a24d-4cf6-aecf-20b41b3287b4-logs" (OuterVolumeSpecName: "logs") pod 
"aa042742-a24d-4cf6-aecf-20b41b3287b4" (UID: "aa042742-a24d-4cf6-aecf-20b41b3287b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.805487 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa042742-a24d-4cf6-aecf-20b41b3287b4-kube-api-access-6rmvg" (OuterVolumeSpecName: "kube-api-access-6rmvg") pod "aa042742-a24d-4cf6-aecf-20b41b3287b4" (UID: "aa042742-a24d-4cf6-aecf-20b41b3287b4"). InnerVolumeSpecName "kube-api-access-6rmvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.811014 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa042742-a24d-4cf6-aecf-20b41b3287b4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aa042742-a24d-4cf6-aecf-20b41b3287b4" (UID: "aa042742-a24d-4cf6-aecf-20b41b3287b4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.818518 4744 scope.go:117] "RemoveContainer" containerID="7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.828531 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-scripts" (OuterVolumeSpecName: "scripts") pod "aa042742-a24d-4cf6-aecf-20b41b3287b4" (UID: "aa042742-a24d-4cf6-aecf-20b41b3287b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.839096 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-config-data" (OuterVolumeSpecName: "config-data") pod "aa042742-a24d-4cf6-aecf-20b41b3287b4" (UID: "aa042742-a24d-4cf6-aecf-20b41b3287b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.870564 4744 scope.go:117] "RemoveContainer" containerID="16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291" Sep 30 03:13:38 crc kubenswrapper[4744]: E0930 03:13:38.871012 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291\": container with ID starting with 16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291 not found: ID does not exist" containerID="16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.871048 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291"} err="failed to get container status \"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291\": rpc error: code = NotFound desc = could not find container \"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291\": container with ID starting with 16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291 not found: ID does not exist" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.871075 4744 scope.go:117] "RemoveContainer" containerID="7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a" Sep 30 03:13:38 crc kubenswrapper[4744]: E0930 03:13:38.872124 4744 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a\": container with ID starting with 7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a not found: ID does not exist" containerID="7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.872147 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a"} err="failed to get container status \"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a\": rpc error: code = NotFound desc = could not find container \"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a\": container with ID starting with 7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a not found: ID does not exist" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.872159 4744 scope.go:117] "RemoveContainer" containerID="16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.873094 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291"} err="failed to get container status \"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291\": rpc error: code = NotFound desc = could not find container \"16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291\": container with ID starting with 16b306b58847a519914ddc17a58357e44abfe332acf7a2d8bd6f4d6140198291 not found: ID does not exist" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.873115 4744 scope.go:117] "RemoveContainer" containerID="7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.873357 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a"} err="failed to get container status \"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a\": rpc error: code = NotFound desc = could not find container \"7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a\": container with ID starting with 7b2b9fcd28f8d7d1660dda4361f4a5b288b9917da68cdbd8e396ed9a53e1722a not found: ID does not exist" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.898563 4744 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa042742-a24d-4cf6-aecf-20b41b3287b4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.898599 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.898607 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa042742-a24d-4cf6-aecf-20b41b3287b4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.898616 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rmvg\" (UniqueName: \"kubernetes.io/projected/aa042742-a24d-4cf6-aecf-20b41b3287b4-kube-api-access-6rmvg\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.898627 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa042742-a24d-4cf6-aecf-20b41b3287b4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:38 crc kubenswrapper[4744]: I0930 03:13:38.993140 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-758dd45f85-fgxnt"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.002396 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-758dd45f85-fgxnt"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.008616 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.093032 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5578f9874f-7lb9c" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.155944 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dd595ddb6-2wvzb"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.156447 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dd595ddb6-2wvzb" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-api" containerID="cri-o://a8cf03771592eff4cb375905c13d671076f830af09c4e80ec6fcced63bc535eb" gracePeriod=30 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.157537 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5dd595ddb6-2wvzb" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-httpd" containerID="cri-o://8c3941b8702915ec6b162e1e49b1276a4d9ca47032d0c110593402c9b0f7311e" gracePeriod=30 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.159352 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.211982 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.219832 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.264947 4744 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.335889 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.528151 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" path="/var/lib/kubelet/pods/31343c55-58c4-4c6a-854c-36bc13beb817/volumes" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.529018 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" path="/var/lib/kubelet/pods/66d32c30-b69c-4637-97a9-c1112b954a92/volumes" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635091 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b554c468b-9gtqj"] Sep 30 03:13:39 crc kubenswrapper[4744]: E0930 03:13:39.635856 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon-log" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635876 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon-log" Sep 30 03:13:39 crc kubenswrapper[4744]: E0930 03:13:39.635899 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" containerName="dnsmasq-dns" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635908 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" containerName="dnsmasq-dns" Sep 30 03:13:39 crc kubenswrapper[4744]: E0930 03:13:39.635922 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon-log" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635929 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon-log" Sep 30 03:13:39 crc kubenswrapper[4744]: E0930 03:13:39.635937 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635943 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon" Sep 30 03:13:39 crc kubenswrapper[4744]: E0930 03:13:39.635961 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635968 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon" Sep 30 03:13:39 crc kubenswrapper[4744]: E0930 03:13:39.635986 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" containerName="init" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.635994 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" containerName="init" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.636340 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="31343c55-58c4-4c6a-854c-36bc13beb817" containerName="dnsmasq-dns" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.636548 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.636563 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d32c30-b69c-4637-97a9-c1112b954a92" containerName="horizon-log" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.636578 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" 
containerName="horizon-log" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.636592 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" containerName="horizon" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.640404 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.643930 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796557ff95-kphjm" event={"ID":"aa042742-a24d-4cf6-aecf-20b41b3287b4","Type":"ContainerDied","Data":"636efd01de4d3323befef167f6d75ea8d5faaaa8707bbaaff82ca7a8bd183d68"} Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.643977 4744 scope.go:117] "RemoveContainer" containerID="eac00480fcc09aea56d5c80100cc25aebb5eafdb2a9df36c2149d681a3e4665a" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.644129 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-796557ff95-kphjm" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.649431 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.652119 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.659594 4744 generic.go:334] "Generic (PLEG): container finished" podID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerID="8c3941b8702915ec6b162e1e49b1276a4d9ca47032d0c110593402c9b0f7311e" exitCode=0 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.659704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd595ddb6-2wvzb" event={"ID":"d8d97540-160a-4b25-9a0a-7ee3c27775f3","Type":"ContainerDied","Data":"8c3941b8702915ec6b162e1e49b1276a4d9ca47032d0c110593402c9b0f7311e"} Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.674964 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="probe" containerID="cri-o://31b125f3ecaef0be11faf3853af66f1e39df34490f86fb0a01bbdfc186420ed7" gracePeriod=30 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.674515 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="cinder-backup" containerID="cri-o://c7b39fca39873d8c9349f9d620bc66e7e66a2c4bba8b884794ecd81a50f63d12" gracePeriod=30 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.675955 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="cinder-volume" containerID="cri-o://94a7e815f90b3be48a274556c8c29cae05a2c068205fc1c34290a5f4f91301a1" 
gracePeriod=30 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.676312 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="probe" containerID="cri-o://c58f92b97d3150ec8bd56d6c1c74d3e1c513a32fb545a802ad231a0ccc810148" gracePeriod=30 Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.679573 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b554c468b-9gtqj"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.772497 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.786699 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-796557ff95-kphjm"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.796027 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-796557ff95-kphjm"] Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826517 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-config-data\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826654 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-internal-tls-certs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826710 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-config-data-custom\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826787 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-logs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826838 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-combined-ca-bundle\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826868 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44ns\" (UniqueName: \"kubernetes.io/projected/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-kube-api-access-l44ns\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.826977 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-public-tls-certs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.924997 4744 scope.go:117] "RemoveContainer" 
containerID="a10058c3b49add8e4fe84a75fecff0f0ea29107b6cd728fe3c1119456e886646" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-internal-tls-certs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-config-data-custom\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929504 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-logs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929541 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-combined-ca-bundle\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929562 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44ns\" (UniqueName: \"kubernetes.io/projected/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-kube-api-access-l44ns\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " 
pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929595 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-public-tls-certs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.929662 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-config-data\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.932394 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-logs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.935870 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-internal-tls-certs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.935935 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-config-data\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 
03:13:39.951175 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-combined-ca-bundle\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.951412 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-config-data-custom\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.957562 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44ns\" (UniqueName: \"kubernetes.io/projected/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-kube-api-access-l44ns\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.965956 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf5b5e2-d32c-4714-864b-06e2f15dd3ce-public-tls-certs\") pod \"barbican-api-6b554c468b-9gtqj\" (UID: \"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce\") " pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:39 crc kubenswrapper[4744]: I0930 03:13:39.983895 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.505453 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b554c468b-9gtqj"] Sep 30 03:13:40 crc kubenswrapper[4744]: W0930 03:13:40.507597 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbf5b5e2_d32c_4714_864b_06e2f15dd3ce.slice/crio-46aec6112df40a00206aecaa05c5229d8f9993b308190ad476fd6fb88431e0a3 WatchSource:0}: Error finding container 46aec6112df40a00206aecaa05c5229d8f9993b308190ad476fd6fb88431e0a3: Status 404 returned error can't find the container with id 46aec6112df40a00206aecaa05c5229d8f9993b308190ad476fd6fb88431e0a3 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.687492 4744 generic.go:334] "Generic (PLEG): container finished" podID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerID="c58f92b97d3150ec8bd56d6c1c74d3e1c513a32fb545a802ad231a0ccc810148" exitCode=0 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.687529 4744 generic.go:334] "Generic (PLEG): container finished" podID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerID="94a7e815f90b3be48a274556c8c29cae05a2c068205fc1c34290a5f4f91301a1" exitCode=0 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.687580 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"0d5e764c-92e1-48ab-a324-a8bac2865222","Type":"ContainerDied","Data":"c58f92b97d3150ec8bd56d6c1c74d3e1c513a32fb545a802ad231a0ccc810148"} Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.687610 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"0d5e764c-92e1-48ab-a324-a8bac2865222","Type":"ContainerDied","Data":"94a7e815f90b3be48a274556c8c29cae05a2c068205fc1c34290a5f4f91301a1"} Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.689046 4744 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-6b554c468b-9gtqj" event={"ID":"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce","Type":"ContainerStarted","Data":"46aec6112df40a00206aecaa05c5229d8f9993b308190ad476fd6fb88431e0a3"} Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.692307 4744 generic.go:334] "Generic (PLEG): container finished" podID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerID="31b125f3ecaef0be11faf3853af66f1e39df34490f86fb0a01bbdfc186420ed7" exitCode=0 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.692440 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7854f055-f44d-4abb-89a1-c11c436fc5bd","Type":"ContainerDied","Data":"31b125f3ecaef0be11faf3853af66f1e39df34490f86fb0a01bbdfc186420ed7"} Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.692612 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api" containerID="cri-o://a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48" gracePeriod=30 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.692699 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api-log" containerID="cri-o://2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7" gracePeriod=30 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.692745 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="cinder-scheduler" containerID="cri-o://418df48d044b0a8cc494017001bf3ece991bc433a8a912333d4b59ea1cd5fabb" gracePeriod=30 Sep 30 03:13:40 crc kubenswrapper[4744]: I0930 03:13:40.692904 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="probe" containerID="cri-o://b3be161428522859d0f371e5cf866cfc5cd68d85d26265fdd82f1883f371573c" gracePeriod=30 Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.230590 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.341079 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379637 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-nvme\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379732 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-iscsi\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379768 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379808 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-sys\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379839 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-brick\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379858 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-machine-id\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379877 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-combined-ca-bundle\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379905 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-dev\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379964 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data-custom\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.379995 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45svz\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-kube-api-access-45svz\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: 
\"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380021 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-lib-cinder\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380063 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-scripts\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380087 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-cinder\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380115 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-run\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380170 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-lib-modules\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380235 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-ceph\") pod \"0d5e764c-92e1-48ab-a324-a8bac2865222\" (UID: \"0d5e764c-92e1-48ab-a324-a8bac2865222\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380427 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-dev" (OuterVolumeSpecName: "dev") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380473 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380493 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380769 4744 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-nvme\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380796 4744 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-iscsi\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.380810 4744 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-dev\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382534 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-sys" (OuterVolumeSpecName: "sys") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382567 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382847 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382908 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-run" (OuterVolumeSpecName: "run") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382931 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.382967 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.385784 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-ceph" (OuterVolumeSpecName: "ceph") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.388599 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-kube-api-access-45svz" (OuterVolumeSpecName: "kube-api-access-45svz") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "kube-api-access-45svz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.391042 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.391615 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.398053 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-scripts" (OuterVolumeSpecName: "scripts") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.459476 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482332 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482357 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482380 4744 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482389 4744 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-lib-modules\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 
03:13:41.482397 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482406 4744 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-sys\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482413 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-locks-brick\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482423 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482433 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482441 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482449 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45svz\" (UniqueName: \"kubernetes.io/projected/0d5e764c-92e1-48ab-a324-a8bac2865222-kube-api-access-45svz\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.482457 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/0d5e764c-92e1-48ab-a324-a8bac2865222-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.519246 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa042742-a24d-4cf6-aecf-20b41b3287b4" path="/var/lib/kubelet/pods/aa042742-a24d-4cf6-aecf-20b41b3287b4/volumes" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.524674 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data" (OuterVolumeSpecName: "config-data") pod "0d5e764c-92e1-48ab-a324-a8bac2865222" (UID: "0d5e764c-92e1-48ab-a324-a8bac2865222"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583176 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6272\" (UniqueName: \"kubernetes.io/projected/837b73b3-fd73-48a3-888c-874871f8ce4c-kube-api-access-w6272\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583255 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-combined-ca-bundle\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583334 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data-custom\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583400 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583439 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-scripts\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583513 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837b73b3-fd73-48a3-888c-874871f8ce4c-logs\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.583653 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/837b73b3-fd73-48a3-888c-874871f8ce4c-etc-machine-id\") pod \"837b73b3-fd73-48a3-888c-874871f8ce4c\" (UID: \"837b73b3-fd73-48a3-888c-874871f8ce4c\") " Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.584109 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d5e764c-92e1-48ab-a324-a8bac2865222-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.584169 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/837b73b3-fd73-48a3-888c-874871f8ce4c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.584633 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837b73b3-fd73-48a3-888c-874871f8ce4c-logs" (OuterVolumeSpecName: "logs") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.589784 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837b73b3-fd73-48a3-888c-874871f8ce4c-kube-api-access-w6272" (OuterVolumeSpecName: "kube-api-access-w6272") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "kube-api-access-w6272". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.592283 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-scripts" (OuterVolumeSpecName: "scripts") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.621479 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.638514 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.663881 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.673563 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data" (OuterVolumeSpecName: "config-data") pod "837b73b3-fd73-48a3-888c-874871f8ce4c" (UID: "837b73b3-fd73-48a3-888c-874871f8ce4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685354 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/837b73b3-fd73-48a3-888c-874871f8ce4c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685400 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6272\" (UniqueName: \"kubernetes.io/projected/837b73b3-fd73-48a3-888c-874871f8ce4c-kube-api-access-w6272\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685412 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685422 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685432 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685439 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837b73b3-fd73-48a3-888c-874871f8ce4c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.685450 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837b73b3-fd73-48a3-888c-874871f8ce4c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.721592 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"0d5e764c-92e1-48ab-a324-a8bac2865222","Type":"ContainerDied","Data":"96de829a73993b5aa5d4c3406c4d263a305c67d25f29300538feb2463aa70f15"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.721648 4744 scope.go:117] "RemoveContainer" containerID="c58f92b97d3150ec8bd56d6c1c74d3e1c513a32fb545a802ad231a0ccc810148" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.721751 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.726329 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b554c468b-9gtqj" event={"ID":"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce","Type":"ContainerStarted","Data":"37b751b1b5bbe407f1edb7d45900a183666dc07a1d994d6410e1f54bbc2ed6c5"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.726384 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b554c468b-9gtqj" event={"ID":"fbf5b5e2-d32c-4714-864b-06e2f15dd3ce","Type":"ContainerStarted","Data":"d36057576cf696a1e8baa87ac255909dd423cd9f5070a482170f8ca77b9336fb"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.727217 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.727236 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.755504 4744 generic.go:334] "Generic (PLEG): container finished" podID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerID="a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48" exitCode=0 Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.755534 4744 generic.go:334] "Generic (PLEG): container finished" podID="837b73b3-fd73-48a3-888c-874871f8ce4c" 
containerID="2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7" exitCode=143 Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.755612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"837b73b3-fd73-48a3-888c-874871f8ce4c","Type":"ContainerDied","Data":"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.755653 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.755667 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"837b73b3-fd73-48a3-888c-874871f8ce4c","Type":"ContainerDied","Data":"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.755681 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"837b73b3-fd73-48a3-888c-874871f8ce4c","Type":"ContainerDied","Data":"774899f4a62f1ed7307a0fa11625ee9b6d1f9f01b490b96f1640972cdcffde8f"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.761272 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b554c468b-9gtqj" podStartSLOduration=2.761253975 podStartE2EDuration="2.761253975s" podCreationTimestamp="2025-09-30 03:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:41.7553067 +0000 UTC m=+1148.928526674" watchObservedRunningTime="2025-09-30 03:13:41.761253975 +0000 UTC m=+1148.934473949" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.761328 4744 generic.go:334] "Generic (PLEG): container finished" podID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerID="b3be161428522859d0f371e5cf866cfc5cd68d85d26265fdd82f1883f371573c" exitCode=0 Sep 30 03:13:41 crc 
kubenswrapper[4744]: I0930 03:13:41.761357 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea1a6e80-d761-4986-b5e1-b6f557bb65b2","Type":"ContainerDied","Data":"b3be161428522859d0f371e5cf866cfc5cd68d85d26265fdd82f1883f371573c"} Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.776667 4744 scope.go:117] "RemoveContainer" containerID="94a7e815f90b3be48a274556c8c29cae05a2c068205fc1c34290a5f4f91301a1" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.949694 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.959354 4744 scope.go:117] "RemoveContainer" containerID="a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.976964 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:13:41 crc kubenswrapper[4744]: I0930 03:13:41.996474 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.015913 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.015956 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.033524 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: E0930 03:13:42.041947 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="probe" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.041979 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="probe" Sep 30 03:13:42 crc 
kubenswrapper[4744]: E0930 03:13:42.042013 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.042020 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api" Sep 30 03:13:42 crc kubenswrapper[4744]: E0930 03:13:42.042040 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="cinder-volume" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.042046 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="cinder-volume" Sep 30 03:13:42 crc kubenswrapper[4744]: E0930 03:13:42.042076 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api-log" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.042082 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api-log" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.060939 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="cinder-volume" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.060992 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api-log" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.061023 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" containerName="manila-api" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.061037 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" containerName="probe" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 
03:13:42.083836 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.086715 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.088209 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.089855 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.092935 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.093051 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.093298 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.093262 4744 scope.go:117] "RemoveContainer" containerID="2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.100612 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.100692 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126227 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" 
Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126269 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126293 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2tnk\" (UniqueName: \"kubernetes.io/projected/a1d320da-1463-4d51-beff-da49872cdb35-kube-api-access-l2tnk\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126323 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-sys\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126337 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126458 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: 
I0930 03:13:42.126479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126494 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d320da-1463-4d51-beff-da49872cdb35-etc-machine-id\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126513 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-config-data-custom\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126540 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126570 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126665 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-internal-tls-certs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126703 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-run\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.126724 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-public-tls-certs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127283 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-config-data\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127297 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1d320da-1463-4d51-beff-da49872cdb35-logs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127315 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127383 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127401 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127420 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/597b8dc3-9c8f-48c4-b554-7d8564395142-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127445 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127463 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-scripts\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127479 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.127494 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjn9\" (UniqueName: \"kubernetes.io/projected/597b8dc3-9c8f-48c4-b554-7d8564395142-kube-api-access-4sjn9\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.155821 4744 scope.go:117] "RemoveContainer" containerID="a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48" Sep 30 03:13:42 crc kubenswrapper[4744]: E0930 03:13:42.156399 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48\": container with ID starting with a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48 not found: ID does not exist" containerID="a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.156446 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48"} err="failed to get container status \"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48\": rpc error: code = NotFound desc = could not find container \"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48\": container with ID starting with a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48 not found: ID does not exist" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.156470 4744 scope.go:117] "RemoveContainer" containerID="2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7" Sep 30 03:13:42 crc kubenswrapper[4744]: E0930 03:13:42.159526 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7\": container with ID starting with 2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7 not found: ID does not exist" containerID="2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.159557 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7"} err="failed to get container status \"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7\": rpc error: code = NotFound desc = could not find container \"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7\": container with ID starting with 2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7 not found: ID does not exist" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.159576 4744 scope.go:117] "RemoveContainer" containerID="a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 
03:13:42.162125 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48"} err="failed to get container status \"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48\": rpc error: code = NotFound desc = could not find container \"a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48\": container with ID starting with a707f7e8709e0be6e3338023faf1c4a175c73083739d1dda466c44f8624a6c48 not found: ID does not exist" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.162232 4744 scope.go:117] "RemoveContainer" containerID="2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.162546 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7"} err="failed to get container status \"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7\": rpc error: code = NotFound desc = could not find container \"2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7\": container with ID starting with 2b1a47974833001a42344eccd9e77eceb7fee95c16599cd05efdbc9df1f4e8d7 not found: ID does not exist" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229618 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229655 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229679 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/597b8dc3-9c8f-48c4-b554-7d8564395142-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229703 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-dev\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229724 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-scripts\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229741 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229758 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjn9\" (UniqueName: \"kubernetes.io/projected/597b8dc3-9c8f-48c4-b554-7d8564395142-kube-api-access-4sjn9\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229791 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229815 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2tnk\" (UniqueName: \"kubernetes.io/projected/a1d320da-1463-4d51-beff-da49872cdb35-kube-api-access-l2tnk\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229857 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-sys\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229872 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229908 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229928 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229943 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d320da-1463-4d51-beff-da49872cdb35-etc-machine-id\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229961 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-config-data-custom\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.229989 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230008 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: 
\"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230032 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-internal-tls-certs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230048 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-run\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230067 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-public-tls-certs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230081 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230115 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-config-data\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230133 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1d320da-1463-4d51-beff-da49872cdb35-logs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.230528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.231567 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.231621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.233381 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.233635 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.233676 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.233697 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-sys\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.233851 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-run\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.234104 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1d320da-1463-4d51-beff-da49872cdb35-logs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.234154 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.243915 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a1d320da-1463-4d51-beff-da49872cdb35-etc-machine-id\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.243976 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/597b8dc3-9c8f-48c4-b554-7d8564395142-dev\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.250661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/597b8dc3-9c8f-48c4-b554-7d8564395142-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: E0930 03:13:42.254971 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d5e764c_92e1_48ab_a324_a8bac2865222.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d5e764c_92e1_48ab_a324_a8bac2865222.slice/crio-96de829a73993b5aa5d4c3406c4d263a305c67d25f29300538feb2463aa70f15\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod837b73b3_fd73_48a3_888c_874871f8ce4c.slice/crio-774899f4a62f1ed7307a0fa11625ee9b6d1f9f01b490b96f1640972cdcffde8f\": RecentStats: unable to find data in memory cache]" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.256356 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.256985 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-scripts\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.257134 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-config-data\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.257486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-public-tls-certs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.257916 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-config-data-custom\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc 
kubenswrapper[4744]: I0930 03:13:42.258623 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.265867 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.272008 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2tnk\" (UniqueName: \"kubernetes.io/projected/a1d320da-1463-4d51-beff-da49872cdb35-kube-api-access-l2tnk\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.276919 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-internal-tls-certs\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.277443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d320da-1463-4d51-beff-da49872cdb35-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a1d320da-1463-4d51-beff-da49872cdb35\") " pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.282765 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/597b8dc3-9c8f-48c4-b554-7d8564395142-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.282942 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjn9\" (UniqueName: \"kubernetes.io/projected/597b8dc3-9c8f-48c4-b554-7d8564395142-kube-api-access-4sjn9\") pod \"cinder-volume-volume1-0\" (UID: \"597b8dc3-9c8f-48c4-b554-7d8564395142\") " pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.433255 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.442529 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.450685 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.956073 4744 generic.go:334] "Generic (PLEG): container finished" podID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerID="418df48d044b0a8cc494017001bf3ece991bc433a8a912333d4b59ea1cd5fabb" exitCode=0 Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.956434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea1a6e80-d761-4986-b5e1-b6f557bb65b2","Type":"ContainerDied","Data":"418df48d044b0a8cc494017001bf3ece991bc433a8a912333d4b59ea1cd5fabb"} Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.974378 4744 generic.go:334] "Generic (PLEG): container finished" podID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerID="c7b39fca39873d8c9349f9d620bc66e7e66a2c4bba8b884794ecd81a50f63d12" exitCode=0 Sep 30 03:13:42 crc kubenswrapper[4744]: I0930 03:13:42.974445 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7854f055-f44d-4abb-89a1-c11c436fc5bd","Type":"ContainerDied","Data":"c7b39fca39873d8c9349f9d620bc66e7e66a2c4bba8b884794ecd81a50f63d12"} Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.149411 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.234430 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.261988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-iscsi\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262048 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-scripts\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262063 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-lib-cinder\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262089 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-machine-id\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262125 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-combined-ca-bundle\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc 
kubenswrapper[4744]: I0930 03:13:43.262148 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-nvme\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262200 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data-custom\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262225 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-cinder\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262245 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-brick\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262258 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-lib-modules\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262276 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrzw9\" (UniqueName: 
\"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-kube-api-access-xrzw9\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262385 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262398 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-run\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262421 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-ceph\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262476 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-sys\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262499 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-dev\") pod \"7854f055-f44d-4abb-89a1-c11c436fc5bd\" (UID: \"7854f055-f44d-4abb-89a1-c11c436fc5bd\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262915 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-dev" (OuterVolumeSpecName: "dev") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.262945 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.267472 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.267531 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.268712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.268735 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.276163 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-scripts" (OuterVolumeSpecName: "scripts") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.276835 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.276869 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-run" (OuterVolumeSpecName: "run") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.277839 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.277884 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-sys" (OuterVolumeSpecName: "sys") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.295408 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-ceph" (OuterVolumeSpecName: "ceph") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.303805 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-kube-api-access-xrzw9" (OuterVolumeSpecName: "kube-api-access-xrzw9") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "kube-api-access-xrzw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.322120 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.362104 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366145 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-brick\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366176 4744 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-lib-modules\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366185 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrzw9\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-kube-api-access-xrzw9\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366197 4744 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366205 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7854f055-f44d-4abb-89a1-c11c436fc5bd-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc 
kubenswrapper[4744]: I0930 03:13:43.366212 4744 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-sys\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366220 4744 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-dev\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366228 4744 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-iscsi\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366236 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366248 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366256 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366264 4744 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-etc-nvme\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366274 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.366282 4744 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7854f055-f44d-4abb-89a1-c11c436fc5bd-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.467335 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.469001 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.496104 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.533089 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5e764c-92e1-48ab-a324-a8bac2865222" path="/var/lib/kubelet/pods/0d5e764c-92e1-48ab-a324-a8bac2865222/volumes" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.533980 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837b73b3-fd73-48a3-888c-874871f8ce4c" path="/var/lib/kubelet/pods/837b73b3-fd73-48a3-888c-874871f8ce4c/volumes" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.569911 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data\") pod \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.569980 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt4wl\" (UniqueName: \"kubernetes.io/projected/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-kube-api-access-kt4wl\") pod \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.570156 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-scripts\") pod \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.570205 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-combined-ca-bundle\") pod \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " Sep 30 03:13:43 crc 
kubenswrapper[4744]: I0930 03:13:43.570252 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data-custom\") pod \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.570285 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-etc-machine-id\") pod \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\" (UID: \"ea1a6e80-d761-4986-b5e1-b6f557bb65b2\") " Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.570778 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ea1a6e80-d761-4986-b5e1-b6f557bb65b2" (UID: "ea1a6e80-d761-4986-b5e1-b6f557bb65b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.571799 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.576257 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-kube-api-access-kt4wl" (OuterVolumeSpecName: "kube-api-access-kt4wl") pod "ea1a6e80-d761-4986-b5e1-b6f557bb65b2" (UID: "ea1a6e80-d761-4986-b5e1-b6f557bb65b2"). InnerVolumeSpecName "kube-api-access-kt4wl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.587595 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea1a6e80-d761-4986-b5e1-b6f557bb65b2" (UID: "ea1a6e80-d761-4986-b5e1-b6f557bb65b2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.587619 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-scripts" (OuterVolumeSpecName: "scripts") pod "ea1a6e80-d761-4986-b5e1-b6f557bb65b2" (UID: "ea1a6e80-d761-4986-b5e1-b6f557bb65b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.597525 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data" (OuterVolumeSpecName: "config-data") pod "7854f055-f44d-4abb-89a1-c11c436fc5bd" (UID: "7854f055-f44d-4abb-89a1-c11c436fc5bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.676278 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.676306 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.676314 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt4wl\" (UniqueName: \"kubernetes.io/projected/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-kube-api-access-kt4wl\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:43 crc kubenswrapper[4744]: I0930 03:13:43.676323 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7854f055-f44d-4abb-89a1-c11c436fc5bd-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.021834 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1a6e80-d761-4986-b5e1-b6f557bb65b2" (UID: "ea1a6e80-d761-4986-b5e1-b6f557bb65b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.025727 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea1a6e80-d761-4986-b5e1-b6f557bb65b2","Type":"ContainerDied","Data":"9e302106c75ba319383cc5b2201ec23500db489f634cb6223d62a0e7f4ac2c3a"} Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.025772 4744 scope.go:117] "RemoveContainer" containerID="b3be161428522859d0f371e5cf866cfc5cd68d85d26265fdd82f1883f371573c" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.025879 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.062305 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"7854f055-f44d-4abb-89a1-c11c436fc5bd","Type":"ContainerDied","Data":"dcd2810580f28695f706bb9f701e5f40f7d252904d76b3a7df7ffb730251071a"} Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.062423 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.094342 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a1d320da-1463-4d51-beff-da49872cdb35","Type":"ContainerStarted","Data":"2b6fff6fff0a2f229916dcae985e767823043334df3a66b6784d90b3fb0fcfed"} Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.111009 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.112405 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"597b8dc3-9c8f-48c4-b554-7d8564395142","Type":"ContainerStarted","Data":"af061c796740366173d653a92bb34a2b33dc611d94d243af29415c82ab4a5dd0"} Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.164527 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data" (OuterVolumeSpecName: "config-data") pod "ea1a6e80-d761-4986-b5e1-b6f557bb65b2" (UID: "ea1a6e80-d761-4986-b5e1-b6f557bb65b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.214579 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1a6e80-d761-4986-b5e1-b6f557bb65b2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.240472 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.271724 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7ddc58d856-kwfp8" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.296520 4744 scope.go:117] "RemoveContainer" containerID="418df48d044b0a8cc494017001bf3ece991bc433a8a912333d4b59ea1cd5fabb" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.383422 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.409209 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.420429 4744 scope.go:117] "RemoveContainer" containerID="31b125f3ecaef0be11faf3853af66f1e39df34490f86fb0a01bbdfc186420ed7" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.436978 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: E0930 03:13:44.437381 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="probe" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437392 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="probe" Sep 30 03:13:44 crc kubenswrapper[4744]: E0930 03:13:44.437407 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="probe" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437415 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="probe" Sep 30 03:13:44 crc kubenswrapper[4744]: E0930 03:13:44.437427 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="cinder-backup" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437434 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="cinder-backup" Sep 30 03:13:44 crc kubenswrapper[4744]: E0930 03:13:44.437453 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="cinder-scheduler" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437459 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="cinder-scheduler" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437636 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="cinder-backup" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437647 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" containerName="probe" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437667 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="cinder-scheduler" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.437677 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" containerName="probe" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.445516 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.452499 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.458234 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.475416 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.490426 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.492483 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.495549 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.497689 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520336 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520618 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 
03:13:44.520635 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-run\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520660 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520693 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520736 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6cc7863-b10e-47a4-bd86-5c66436d4af4-ceph\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520756 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520773 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-scripts\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520786 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520814 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-dev\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520835 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520852 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520871 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2b9\" (UniqueName: \"kubernetes.io/projected/e6cc7863-b10e-47a4-bd86-5c66436d4af4-kube-api-access-6j2b9\") pod \"cinder-backup-0\" (UID: 
\"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520894 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520909 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-sys\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.520938 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-config-data\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.525627 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.546096 4744 scope.go:117] "RemoveContainer" containerID="c7b39fca39873d8c9349f9d620bc66e7e66a2c4bba8b884794ecd81a50f63d12" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632537 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632598 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6cc7863-b10e-47a4-bd86-5c66436d4af4-ceph\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632619 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632638 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632655 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvw9n\" (UniqueName: \"kubernetes.io/projected/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-kube-api-access-dvw9n\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632675 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-scripts\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632690 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-nvme\") pod 
\"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632721 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-dev\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2b9\" (UniqueName: \"kubernetes.io/projected/e6cc7863-b10e-47a4-bd86-5c66436d4af4-kube-api-access-6j2b9\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632800 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632819 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-sys\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632851 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-config-data\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632879 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632901 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632914 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-run\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-locks-brick\") pod \"cinder-backup-0\" (UID: 
\"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632956 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.632974 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.633007 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.633017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.633047 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 
03:13:44.633292 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.634888 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-lib-modules\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.635038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.635538 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.635669 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-run\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.635889 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-locks-brick\") pod \"cinder-backup-0\" (UID: 
\"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.636009 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-dev\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.636112 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-sys\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.636267 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e6cc7863-b10e-47a4-bd86-5c66436d4af4-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.646096 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-config-data\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.647708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-scripts\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.648598 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.651030 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6cc7863-b10e-47a4-bd86-5c66436d4af4-ceph\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.652836 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6cc7863-b10e-47a4-bd86-5c66436d4af4-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.680958 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2b9\" (UniqueName: \"kubernetes.io/projected/e6cc7863-b10e-47a4-bd86-5c66436d4af4-kube-api-access-6j2b9\") pod \"cinder-backup-0\" (UID: \"e6cc7863-b10e-47a4-bd86-5c66436d4af4\") " pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.714089 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-78db449746-kg7zl" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.737939 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.738092 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.738188 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.738243 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvw9n\" (UniqueName: \"kubernetes.io/projected/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-kube-api-access-dvw9n\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.738623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.738647 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.741296 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " 
pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.750549 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.750845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.754936 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.755837 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.764817 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvw9n\" (UniqueName: \"kubernetes.io/projected/1ba32e4a-2e93-4483-9acf-a7a72792b0f6-kube-api-access-dvw9n\") pod \"cinder-scheduler-0\" (UID: \"1ba32e4a-2e93-4483-9acf-a7a72792b0f6\") " pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.771558 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/manila-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.784597 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-787b588c76-v5mnn"] Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.785405 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon-log" containerID="cri-o://4ff69006dc4a5c42d13adad171a2a3135ef326af540050a2d29e2347ee5a8552" gracePeriod=30 Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.785844 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" containerID="cri-o://4a0f5cb143bcfa57cbcb5bcafed97ea70a4e3574637e8b06418ee825b8820047" gracePeriod=30 Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.818005 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.828896 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.856654 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 03:13:44 crc kubenswrapper[4744]: I0930 03:13:44.866051 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.091417 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-jc7k5"] Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.091967 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerName="dnsmasq-dns" containerID="cri-o://c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5" gracePeriod=10 Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.200298 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"597b8dc3-9c8f-48c4-b554-7d8564395142","Type":"ContainerStarted","Data":"04bb15cee4575ca13323889fa3b5a5d0deaac285d73874bf403a26a636c9df34"} Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.200670 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"597b8dc3-9c8f-48c4-b554-7d8564395142","Type":"ContainerStarted","Data":"9c4da126cdc0f3cfd603d6fea71f45f1f6d00f1446a2af74a985e4a3f4158b9a"} Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.229007 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.228991575 podStartE2EDuration="4.228991575s" podCreationTimestamp="2025-09-30 03:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:45.222039439 +0000 UTC m=+1152.395259403" watchObservedRunningTime="2025-09-30 03:13:45.228991575 +0000 UTC m=+1152.402211549" Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 
03:13:45.282111 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a1d320da-1463-4d51-beff-da49872cdb35","Type":"ContainerStarted","Data":"1fdf20d59dcabc5def242f74b9216c9b7a5893e1ff7a5971d9d1e16a847dc7aa"} Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.284326 4744 generic.go:334] "Generic (PLEG): container finished" podID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerID="a8cf03771592eff4cb375905c13d671076f830af09c4e80ec6fcced63bc535eb" exitCode=0 Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.285448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd595ddb6-2wvzb" event={"ID":"d8d97540-160a-4b25-9a0a-7ee3c27775f3","Type":"ContainerDied","Data":"a8cf03771592eff4cb375905c13d671076f830af09c4e80ec6fcced63bc535eb"} Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.519431 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7854f055-f44d-4abb-89a1-c11c436fc5bd" path="/var/lib/kubelet/pods/7854f055-f44d-4abb-89a1-c11c436fc5bd/volumes" Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.520220 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1a6e80-d761-4986-b5e1-b6f557bb65b2" path="/var/lib/kubelet/pods/ea1a6e80-d761-4986-b5e1-b6f557bb65b2/volumes" Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.605048 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.839393 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.867842 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.870993 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.987806 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-httpd-config\") pod \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.987864 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s5bm\" (UniqueName: \"kubernetes.io/projected/6ca18f47-7a09-4040-89f2-0b8c3f77a032-kube-api-access-4s5bm\") pod \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.987910 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-ovndb-tls-certs\") pod \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988010 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-svc\") pod \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988043 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-config\") pod \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988085 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-swift-storage-0\") pod \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988112 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-config\") pod \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988139 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-nb\") pod \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988163 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-combined-ca-bundle\") pod \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988206 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-sb\") pod \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\" (UID: \"6ca18f47-7a09-4040-89f2-0b8c3f77a032\") " Sep 30 03:13:45 crc kubenswrapper[4744]: I0930 03:13:45.988288 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd2k7\" (UniqueName: \"kubernetes.io/projected/d8d97540-160a-4b25-9a0a-7ee3c27775f3-kube-api-access-jd2k7\") pod \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\" (UID: \"d8d97540-160a-4b25-9a0a-7ee3c27775f3\") " Sep 30 03:13:45 crc kubenswrapper[4744]: 
I0930 03:13:45.998996 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d97540-160a-4b25-9a0a-7ee3c27775f3-kube-api-access-jd2k7" (OuterVolumeSpecName: "kube-api-access-jd2k7") pod "d8d97540-160a-4b25-9a0a-7ee3c27775f3" (UID: "d8d97540-160a-4b25-9a0a-7ee3c27775f3"). InnerVolumeSpecName "kube-api-access-jd2k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.007001 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca18f47-7a09-4040-89f2-0b8c3f77a032-kube-api-access-4s5bm" (OuterVolumeSpecName: "kube-api-access-4s5bm") pod "6ca18f47-7a09-4040-89f2-0b8c3f77a032" (UID: "6ca18f47-7a09-4040-89f2-0b8c3f77a032"). InnerVolumeSpecName "kube-api-access-4s5bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.018485 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d8d97540-160a-4b25-9a0a-7ee3c27775f3" (UID: "d8d97540-160a-4b25-9a0a-7ee3c27775f3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.056915 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8d97540-160a-4b25-9a0a-7ee3c27775f3" (UID: "d8d97540-160a-4b25-9a0a-7ee3c27775f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.058235 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-config" (OuterVolumeSpecName: "config") pod "6ca18f47-7a09-4040-89f2-0b8c3f77a032" (UID: "6ca18f47-7a09-4040-89f2-0b8c3f77a032"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.076835 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ca18f47-7a09-4040-89f2-0b8c3f77a032" (UID: "6ca18f47-7a09-4040-89f2-0b8c3f77a032"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.090997 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.091026 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.091036 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.091065 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd2k7\" (UniqueName: \"kubernetes.io/projected/d8d97540-160a-4b25-9a0a-7ee3c27775f3-kube-api-access-jd2k7\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 
03:13:46.091073 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.091082 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s5bm\" (UniqueName: \"kubernetes.io/projected/6ca18f47-7a09-4040-89f2-0b8c3f77a032-kube-api-access-4s5bm\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.094963 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ca18f47-7a09-4040-89f2-0b8c3f77a032" (UID: "6ca18f47-7a09-4040-89f2-0b8c3f77a032"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.113859 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ca18f47-7a09-4040-89f2-0b8c3f77a032" (UID: "6ca18f47-7a09-4040-89f2-0b8c3f77a032"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.123063 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-config" (OuterVolumeSpecName: "config") pod "d8d97540-160a-4b25-9a0a-7ee3c27775f3" (UID: "d8d97540-160a-4b25-9a0a-7ee3c27775f3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.124032 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ca18f47-7a09-4040-89f2-0b8c3f77a032" (UID: "6ca18f47-7a09-4040-89f2-0b8c3f77a032"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.125501 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d8d97540-160a-4b25-9a0a-7ee3c27775f3" (UID: "d8d97540-160a-4b25-9a0a-7ee3c27775f3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.192505 4744 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.192553 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d8d97540-160a-4b25-9a0a-7ee3c27775f3-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.192563 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.192574 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 
30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.192583 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ca18f47-7a09-4040-89f2-0b8c3f77a032-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.296824 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a1d320da-1463-4d51-beff-da49872cdb35","Type":"ContainerStarted","Data":"c6d09cd772d86f55eba94521e60f23f95ab4b21fc6bf1698b996416ec0d2716e"} Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.297596 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.304662 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6cc7863-b10e-47a4-bd86-5c66436d4af4","Type":"ContainerStarted","Data":"c6c3058e2cae3bd66e607c1c02da49809994bbb564979355ba2b9d810f5a32e2"} Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.320249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dd595ddb6-2wvzb" event={"ID":"d8d97540-160a-4b25-9a0a-7ee3c27775f3","Type":"ContainerDied","Data":"1940feb553390b8685ecbc3028c6999d9a3771a96c9277a3c711e558664cb23e"} Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.321247 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5dd595ddb6-2wvzb" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.321281 4744 scope.go:117] "RemoveContainer" containerID="8c3941b8702915ec6b162e1e49b1276a4d9ca47032d0c110593402c9b0f7311e" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.326187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ba32e4a-2e93-4483-9acf-a7a72792b0f6","Type":"ContainerStarted","Data":"4e5d3ca244392d706dea2f039dcf72d00cbe810d371ea021a38c9fe5a781af90"} Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.332296 4744 generic.go:334] "Generic (PLEG): container finished" podID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerID="c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5" exitCode=0 Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.332343 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" event={"ID":"6ca18f47-7a09-4040-89f2-0b8c3f77a032","Type":"ContainerDied","Data":"c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5"} Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.332407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" event={"ID":"6ca18f47-7a09-4040-89f2-0b8c3f77a032","Type":"ContainerDied","Data":"2bcd0da062e8b8d1eb14314c5be3c0c65998871700e8f938ef6a85b0c72ce88a"} Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.332456 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-jc7k5" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.334025 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.334008588 podStartE2EDuration="5.334008588s" podCreationTimestamp="2025-09-30 03:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:46.315016409 +0000 UTC m=+1153.488236383" watchObservedRunningTime="2025-09-30 03:13:46.334008588 +0000 UTC m=+1153.507228562" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.380715 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5dd595ddb6-2wvzb"] Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.395433 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5dd595ddb6-2wvzb"] Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.400104 4744 scope.go:117] "RemoveContainer" containerID="a8cf03771592eff4cb375905c13d671076f830af09c4e80ec6fcced63bc535eb" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.405847 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-jc7k5"] Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.414057 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-jc7k5"] Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.474122 4744 scope.go:117] "RemoveContainer" containerID="c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.539341 4744 scope.go:117] "RemoveContainer" containerID="b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.589920 4744 scope.go:117] "RemoveContainer" containerID="c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5" Sep 30 
03:13:46 crc kubenswrapper[4744]: E0930 03:13:46.590857 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5\": container with ID starting with c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5 not found: ID does not exist" containerID="c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.590925 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5"} err="failed to get container status \"c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5\": rpc error: code = NotFound desc = could not find container \"c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5\": container with ID starting with c6e2f099442c277eec71105b28105e3d4d9951aa94b9c31c475679d641e4b0f5 not found: ID does not exist" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.590953 4744 scope.go:117] "RemoveContainer" containerID="b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af" Sep 30 03:13:46 crc kubenswrapper[4744]: E0930 03:13:46.593916 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af\": container with ID starting with b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af not found: ID does not exist" containerID="b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af" Sep 30 03:13:46 crc kubenswrapper[4744]: I0930 03:13:46.593947 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af"} err="failed to get container status 
\"b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af\": rpc error: code = NotFound desc = could not find container \"b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af\": container with ID starting with b47783b4ab10354bb0690664fc5144cc688786a9e16af8660579f3d45e5c05af not found: ID does not exist" Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.344495 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6cc7863-b10e-47a4-bd86-5c66436d4af4","Type":"ContainerStarted","Data":"19d0544794954bae178d9f0d2921d8cf07d381ef46094be3f36cb652f8aab1f8"} Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.345005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e6cc7863-b10e-47a4-bd86-5c66436d4af4","Type":"ContainerStarted","Data":"b15e0b89c8180233505b0652f859dba6ee66abddf60be5d318d4c6911f69a754"} Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.363030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ba32e4a-2e93-4483-9acf-a7a72792b0f6","Type":"ContainerStarted","Data":"4b36b8e3e532817625bce7529419420147b860b39d42feb239deefbac2c4130e"} Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.363073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1ba32e4a-2e93-4483-9acf-a7a72792b0f6","Type":"ContainerStarted","Data":"a0308589e86a00b4e278ae925154f6277bce7cd956fce91d653d946e993c4c6e"} Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.368239 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.368223156 podStartE2EDuration="3.368223156s" podCreationTimestamp="2025-09-30 03:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:47.365693207 +0000 UTC 
m=+1154.538913181" watchObservedRunningTime="2025-09-30 03:13:47.368223156 +0000 UTC m=+1154.541443130" Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.383398 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.383379756 podStartE2EDuration="3.383379756s" podCreationTimestamp="2025-09-30 03:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:47.381966392 +0000 UTC m=+1154.555186366" watchObservedRunningTime="2025-09-30 03:13:47.383379756 +0000 UTC m=+1154.556599720" Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.435457 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.517770 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" path="/var/lib/kubelet/pods/6ca18f47-7a09-4040-89f2-0b8c3f77a032/volumes" Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.520759 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" path="/var/lib/kubelet/pods/d8d97540-160a-4b25-9a0a-7ee3c27775f3/volumes" Sep 30 03:13:47 crc kubenswrapper[4744]: I0930 03:13:47.800798 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.195791 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-59ddc4db88-d9q99" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.198835 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:58410->10.217.0.152:8443: read: connection reset by peer" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.374113 4744 generic.go:334] "Generic (PLEG): container finished" podID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerID="4a0f5cb143bcfa57cbcb5bcafed97ea70a4e3574637e8b06418ee825b8820047" exitCode=0 Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.374957 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-787b588c76-v5mnn" event={"ID":"1f214ceb-c91a-4672-8711-9728a3f5e3f3","Type":"ContainerDied","Data":"4a0f5cb143bcfa57cbcb5bcafed97ea70a4e3574637e8b06418ee825b8820047"} Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.724807 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 03:13:48 crc kubenswrapper[4744]: E0930 03:13:48.725453 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-httpd" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725474 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-httpd" Sep 30 03:13:48 crc kubenswrapper[4744]: E0930 03:13:48.725494 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-api" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725500 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-api" Sep 30 03:13:48 crc kubenswrapper[4744]: E0930 03:13:48.725515 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerName="dnsmasq-dns" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725521 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerName="dnsmasq-dns" Sep 30 03:13:48 crc kubenswrapper[4744]: E0930 03:13:48.725533 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerName="init" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725539 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerName="init" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725713 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca18f47-7a09-4040-89f2-0b8c3f77a032" containerName="dnsmasq-dns" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725726 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-httpd" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.725734 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d97540-160a-4b25-9a0a-7ee3c27775f3" containerName="neutron-api" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.726339 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.728616 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q6qd7" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.728791 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.728903 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.734396 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.746900 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.856773 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952fb37a-2fb5-41f5-a9f6-195c94862274-combined-ca-bundle\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.856857 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/952fb37a-2fb5-41f5-a9f6-195c94862274-openstack-config\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.856928 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7h8\" (UniqueName: \"kubernetes.io/projected/952fb37a-2fb5-41f5-a9f6-195c94862274-kube-api-access-7v7h8\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.856984 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/952fb37a-2fb5-41f5-a9f6-195c94862274-openstack-config-secret\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.958964 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/952fb37a-2fb5-41f5-a9f6-195c94862274-openstack-config\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.959058 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7h8\" (UniqueName: \"kubernetes.io/projected/952fb37a-2fb5-41f5-a9f6-195c94862274-kube-api-access-7v7h8\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.959137 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/952fb37a-2fb5-41f5-a9f6-195c94862274-openstack-config-secret\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.959216 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/952fb37a-2fb5-41f5-a9f6-195c94862274-combined-ca-bundle\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.962021 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/952fb37a-2fb5-41f5-a9f6-195c94862274-openstack-config\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.971672 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952fb37a-2fb5-41f5-a9f6-195c94862274-combined-ca-bundle\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.972042 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/952fb37a-2fb5-41f5-a9f6-195c94862274-openstack-config-secret\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:48 crc kubenswrapper[4744]: I0930 03:13:48.982894 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7h8\" (UniqueName: \"kubernetes.io/projected/952fb37a-2fb5-41f5-a9f6-195c94862274-kube-api-access-7v7h8\") pod \"openstackclient\" (UID: \"952fb37a-2fb5-41f5-a9f6-195c94862274\") " pod="openstack/openstackclient" Sep 30 03:13:49 crc kubenswrapper[4744]: I0930 03:13:49.051544 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 03:13:49 crc kubenswrapper[4744]: I0930 03:13:49.829875 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Sep 30 03:13:49 crc kubenswrapper[4744]: I0930 03:13:49.857862 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 03:13:51 crc kubenswrapper[4744]: I0930 03:13:51.593200 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:51 crc kubenswrapper[4744]: I0930 03:13:51.668444 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b554c468b-9gtqj" Sep 30 03:13:51 crc kubenswrapper[4744]: I0930 03:13:51.739361 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6db48c9776-bw4r7"] Sep 30 03:13:51 crc kubenswrapper[4744]: I0930 03:13:51.739658 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6db48c9776-bw4r7" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api-log" containerID="cri-o://9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5" gracePeriod=30 Sep 30 03:13:51 crc kubenswrapper[4744]: I0930 03:13:51.741505 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6db48c9776-bw4r7" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api" containerID="cri-o://6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1" gracePeriod=30 Sep 30 03:13:52 crc kubenswrapper[4744]: W0930 03:13:52.030856 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod952fb37a_2fb5_41f5_a9f6_195c94862274.slice/crio-01ba9f98c08a0819ddda1d918599a673242dd315261e5960d389ef158d78eb77 WatchSource:0}: Error finding 
container 01ba9f98c08a0819ddda1d918599a673242dd315261e5960d389ef158d78eb77: Status 404 returned error can't find the container with id 01ba9f98c08a0819ddda1d918599a673242dd315261e5960d389ef158d78eb77 Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.033084 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.290900 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-64cfcf86c-tq8s6"] Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.296890 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.300763 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.301019 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.301128 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.311983 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-64cfcf86c-tq8s6"] Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.435545 4744 generic.go:334] "Generic (PLEG): container finished" podID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerID="9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5" exitCode=143 Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.435617 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db48c9776-bw4r7" event={"ID":"4af1755b-5573-4aee-aced-42d4b10bcebc","Type":"ContainerDied","Data":"9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5"} Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.465638 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"952fb37a-2fb5-41f5-a9f6-195c94862274","Type":"ContainerStarted","Data":"01ba9f98c08a0819ddda1d918599a673242dd315261e5960d389ef158d78eb77"} Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.473737 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1486a12d-9554-48ca-899d-1286e1b5913b","Type":"ContainerStarted","Data":"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f"} Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-public-tls-certs\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481464 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtstf\" (UniqueName: \"kubernetes.io/projected/2a42e069-1859-4077-8f50-8b285465b47a-kube-api-access-vtstf\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481503 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-internal-tls-certs\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481563 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2a42e069-1859-4077-8f50-8b285465b47a-log-httpd\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481612 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a42e069-1859-4077-8f50-8b285465b47a-run-httpd\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481654 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-combined-ca-bundle\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481703 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2a42e069-1859-4077-8f50-8b285465b47a-etc-swift\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.481727 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-config-data\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.582880 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-public-tls-certs\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.582949 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtstf\" (UniqueName: \"kubernetes.io/projected/2a42e069-1859-4077-8f50-8b285465b47a-kube-api-access-vtstf\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.582973 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-internal-tls-certs\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.583013 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a42e069-1859-4077-8f50-8b285465b47a-log-httpd\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.583038 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a42e069-1859-4077-8f50-8b285465b47a-run-httpd\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.583059 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-combined-ca-bundle\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.583097 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2a42e069-1859-4077-8f50-8b285465b47a-etc-swift\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.583113 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-config-data\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.585324 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a42e069-1859-4077-8f50-8b285465b47a-log-httpd\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.588293 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a42e069-1859-4077-8f50-8b285465b47a-run-httpd\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.591730 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-config-data\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: 
\"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.595188 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-combined-ca-bundle\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.595906 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-internal-tls-certs\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.596358 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a42e069-1859-4077-8f50-8b285465b47a-public-tls-certs\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.617688 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2a42e069-1859-4077-8f50-8b285465b47a-etc-swift\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.630912 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtstf\" (UniqueName: \"kubernetes.io/projected/2a42e069-1859-4077-8f50-8b285465b47a-kube-api-access-vtstf\") pod \"swift-proxy-64cfcf86c-tq8s6\" (UID: \"2a42e069-1859-4077-8f50-8b285465b47a\") " pod="openstack/swift-proxy-64cfcf86c-tq8s6" 
Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.917711 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:52 crc kubenswrapper[4744]: I0930 03:13:52.994930 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Sep 30 03:13:53 crc kubenswrapper[4744]: I0930 03:13:53.487362 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1486a12d-9554-48ca-899d-1286e1b5913b","Type":"ContainerStarted","Data":"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc"} Sep 30 03:13:53 crc kubenswrapper[4744]: I0930 03:13:53.512457 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.791761326 podStartE2EDuration="19.512438484s" podCreationTimestamp="2025-09-30 03:13:34 +0000 UTC" firstStartedPulling="2025-09-30 03:13:35.773411457 +0000 UTC m=+1142.946631431" lastFinishedPulling="2025-09-30 03:13:51.494088615 +0000 UTC m=+1158.667308589" observedRunningTime="2025-09-30 03:13:53.509638847 +0000 UTC m=+1160.682858821" watchObservedRunningTime="2025-09-30 03:13:53.512438484 +0000 UTC m=+1160.685658448" Sep 30 03:13:53 crc kubenswrapper[4744]: I0930 03:13:53.560408 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-64cfcf86c-tq8s6"] Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.504434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64cfcf86c-tq8s6" event={"ID":"2a42e069-1859-4077-8f50-8b285465b47a","Type":"ContainerStarted","Data":"adca4284f3c4749f3f5b9f91082e56ba336c06df6c67ed27d56a86c0da04be86"} Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.504736 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64cfcf86c-tq8s6" 
event={"ID":"2a42e069-1859-4077-8f50-8b285465b47a","Type":"ContainerStarted","Data":"d748bf6b3165b7d65b1775ca06f51ebdea152c693bab9afaed0156f7c08e1d96"} Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.504810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64cfcf86c-tq8s6" event={"ID":"2a42e069-1859-4077-8f50-8b285465b47a","Type":"ContainerStarted","Data":"af8bec9b3e7f7459ebbb504d5cfbf9c240391dc2d7cdd4921113642cb23a13fb"} Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.504832 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.504846 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.529707 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-64cfcf86c-tq8s6" podStartSLOduration=2.529689465 podStartE2EDuration="2.529689465s" podCreationTimestamp="2025-09-30 03:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:13:54.523550244 +0000 UTC m=+1161.696770218" watchObservedRunningTime="2025-09-30 03:13:54.529689465 +0000 UTC m=+1161.702909439" Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.582150 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.582825 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-central-agent" containerID="cri-o://2698a953a95589cf61a782bfc00177599f663e2ce99b6ac1637c69174697adca" gracePeriod=30 Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.582986 4744 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="proxy-httpd" containerID="cri-o://dbc389ec3e3bf40949ca935842c9ff6ed70c16d1ee56e9a1edc4ae9a27014b5d" gracePeriod=30 Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.583051 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="sg-core" containerID="cri-o://3cad0288c8883e37f9ca31fae7c2654dffef18ae90c1be8acf5229ea3e1c1f0b" gracePeriod=30 Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.583133 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-notification-agent" containerID="cri-o://df817ee6232c687339fd952171095ffd9e3863a87c1ddb3ac1baaac7c0c7012c" gracePeriod=30 Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.594227 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.769987 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.968589 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6db48c9776-bw4r7" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:41602->10.217.0.168:9311: read: connection reset by peer" Sep 30 03:13:54 crc kubenswrapper[4744]: I0930 03:13:54.968888 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6db48c9776-bw4r7" 
podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:41608->10.217.0.168:9311: read: connection reset by peer" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.087574 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.123725 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.423619 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.516821 4744 generic.go:334] "Generic (PLEG): container finished" podID="08c01111-188d-4d73-be12-aac3feab4b02" containerID="dbc389ec3e3bf40949ca935842c9ff6ed70c16d1ee56e9a1edc4ae9a27014b5d" exitCode=0 Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.516856 4744 generic.go:334] "Generic (PLEG): container finished" podID="08c01111-188d-4d73-be12-aac3feab4b02" containerID="3cad0288c8883e37f9ca31fae7c2654dffef18ae90c1be8acf5229ea3e1c1f0b" exitCode=2 Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.516869 4744 generic.go:334] "Generic (PLEG): container finished" podID="08c01111-188d-4d73-be12-aac3feab4b02" containerID="2698a953a95589cf61a782bfc00177599f663e2ce99b6ac1637c69174697adca" exitCode=0 Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.519684 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerDied","Data":"dbc389ec3e3bf40949ca935842c9ff6ed70c16d1ee56e9a1edc4ae9a27014b5d"} Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.519726 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerDied","Data":"3cad0288c8883e37f9ca31fae7c2654dffef18ae90c1be8acf5229ea3e1c1f0b"} Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.519742 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerDied","Data":"2698a953a95589cf61a782bfc00177599f663e2ce99b6ac1637c69174697adca"} Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.531299 4744 generic.go:334] "Generic (PLEG): container finished" podID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerID="6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1" exitCode=0 Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.533843 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db48c9776-bw4r7" event={"ID":"4af1755b-5573-4aee-aced-42d4b10bcebc","Type":"ContainerDied","Data":"6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1"} Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.533914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db48c9776-bw4r7" event={"ID":"4af1755b-5573-4aee-aced-42d4b10bcebc","Type":"ContainerDied","Data":"b2bf359e1a7aaf1b529ce2a4727738cf98f20ff5f5a45fc85326c64a07f4b5f6"} Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.533938 4744 scope.go:117] "RemoveContainer" containerID="6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.534139 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6db48c9776-bw4r7" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.569459 4744 scope.go:117] "RemoveContainer" containerID="9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.574053 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data\") pod \"4af1755b-5573-4aee-aced-42d4b10bcebc\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.575145 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af1755b-5573-4aee-aced-42d4b10bcebc-logs\") pod \"4af1755b-5573-4aee-aced-42d4b10bcebc\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.575229 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-combined-ca-bundle\") pod \"4af1755b-5573-4aee-aced-42d4b10bcebc\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.575253 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data-custom\") pod \"4af1755b-5573-4aee-aced-42d4b10bcebc\" (UID: \"4af1755b-5573-4aee-aced-42d4b10bcebc\") " Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.575292 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqslg\" (UniqueName: \"kubernetes.io/projected/4af1755b-5573-4aee-aced-42d4b10bcebc-kube-api-access-rqslg\") pod \"4af1755b-5573-4aee-aced-42d4b10bcebc\" (UID: 
\"4af1755b-5573-4aee-aced-42d4b10bcebc\") " Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.577203 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af1755b-5573-4aee-aced-42d4b10bcebc-logs" (OuterVolumeSpecName: "logs") pod "4af1755b-5573-4aee-aced-42d4b10bcebc" (UID: "4af1755b-5573-4aee-aced-42d4b10bcebc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.584719 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af1755b-5573-4aee-aced-42d4b10bcebc-kube-api-access-rqslg" (OuterVolumeSpecName: "kube-api-access-rqslg") pod "4af1755b-5573-4aee-aced-42d4b10bcebc" (UID: "4af1755b-5573-4aee-aced-42d4b10bcebc"). InnerVolumeSpecName "kube-api-access-rqslg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.590191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4af1755b-5573-4aee-aced-42d4b10bcebc" (UID: "4af1755b-5573-4aee-aced-42d4b10bcebc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.610150 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af1755b-5573-4aee-aced-42d4b10bcebc" (UID: "4af1755b-5573-4aee-aced-42d4b10bcebc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.647790 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data" (OuterVolumeSpecName: "config-data") pod "4af1755b-5573-4aee-aced-42d4b10bcebc" (UID: "4af1755b-5573-4aee-aced-42d4b10bcebc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.677047 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af1755b-5573-4aee-aced-42d4b10bcebc-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.677088 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.677099 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.677107 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqslg\" (UniqueName: \"kubernetes.io/projected/4af1755b-5573-4aee-aced-42d4b10bcebc-kube-api-access-rqslg\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.677118 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af1755b-5573-4aee-aced-42d4b10bcebc-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.711427 4744 scope.go:117] "RemoveContainer" containerID="6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1" Sep 
30 03:13:55 crc kubenswrapper[4744]: E0930 03:13:55.713844 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1\": container with ID starting with 6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1 not found: ID does not exist" containerID="6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.713877 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1"} err="failed to get container status \"6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1\": rpc error: code = NotFound desc = could not find container \"6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1\": container with ID starting with 6527ba3859e0924babdf9f08273ef730381d56554853dc657ec96202abb79aa1 not found: ID does not exist" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.713898 4744 scope.go:117] "RemoveContainer" containerID="9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5" Sep 30 03:13:55 crc kubenswrapper[4744]: E0930 03:13:55.714091 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5\": container with ID starting with 9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5 not found: ID does not exist" containerID="9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.714116 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5"} err="failed to get container status 
\"9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5\": rpc error: code = NotFound desc = could not find container \"9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5\": container with ID starting with 9987aea64240849575c82ce00e9f427a10c6a822085467fd1546e90fe84689d5 not found: ID does not exist" Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.874046 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6db48c9776-bw4r7"] Sep 30 03:13:55 crc kubenswrapper[4744]: I0930 03:13:55.881062 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6db48c9776-bw4r7"] Sep 30 03:13:56 crc kubenswrapper[4744]: I0930 03:13:56.314529 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Sep 30 03:13:56 crc kubenswrapper[4744]: I0930 03:13:56.373441 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:13:56 crc kubenswrapper[4744]: I0930 03:13:56.541671 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="manila-scheduler" containerID="cri-o://043bd75c3b4314935500113eb155dc3a84be9e5f995dca57714a7262128780e2" gracePeriod=30 Sep 30 03:13:56 crc kubenswrapper[4744]: I0930 03:13:56.542036 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="probe" containerID="cri-o://805d11c22078c31436883518ab8445d3e95bab0a2c52d61452efb4cae229919d" gracePeriod=30 Sep 30 03:13:57 crc kubenswrapper[4744]: I0930 03:13:57.513922 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" path="/var/lib/kubelet/pods/4af1755b-5573-4aee-aced-42d4b10bcebc/volumes" Sep 30 03:13:57 crc kubenswrapper[4744]: I0930 03:13:57.555322 4744 
generic.go:334] "Generic (PLEG): container finished" podID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerID="805d11c22078c31436883518ab8445d3e95bab0a2c52d61452efb4cae229919d" exitCode=0 Sep 30 03:13:57 crc kubenswrapper[4744]: I0930 03:13:57.555360 4744 generic.go:334] "Generic (PLEG): container finished" podID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerID="043bd75c3b4314935500113eb155dc3a84be9e5f995dca57714a7262128780e2" exitCode=0 Sep 30 03:13:57 crc kubenswrapper[4744]: I0930 03:13:57.555401 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8410c494-56fd-4498-ad12-e0c6dad119bf","Type":"ContainerDied","Data":"805d11c22078c31436883518ab8445d3e95bab0a2c52d61452efb4cae229919d"} Sep 30 03:13:57 crc kubenswrapper[4744]: I0930 03:13:57.555430 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8410c494-56fd-4498-ad12-e0c6dad119bf","Type":"ContainerDied","Data":"043bd75c3b4314935500113eb155dc3a84be9e5f995dca57714a7262128780e2"} Sep 30 03:13:58 crc kubenswrapper[4744]: I0930 03:13:58.745597 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 30 03:13:59 crc kubenswrapper[4744]: I0930 03:13:59.587607 4744 generic.go:334] "Generic (PLEG): container finished" podID="08c01111-188d-4d73-be12-aac3feab4b02" containerID="df817ee6232c687339fd952171095ffd9e3863a87c1ddb3ac1baaac7c0c7012c" exitCode=0 Sep 30 03:13:59 crc kubenswrapper[4744]: I0930 03:13:59.587698 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerDied","Data":"df817ee6232c687339fd952171095ffd9e3863a87c1ddb3ac1baaac7c0c7012c"} Sep 30 
03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.367800 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429391 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-combined-ca-bundle\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429461 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-log-httpd\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429529 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-run-httpd\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429587 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qc4n\" (UniqueName: \"kubernetes.io/projected/08c01111-188d-4d73-be12-aac3feab4b02-kube-api-access-8qc4n\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-config-data\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429697 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-scripts\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.429729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-sg-core-conf-yaml\") pod \"08c01111-188d-4d73-be12-aac3feab4b02\" (UID: \"08c01111-188d-4d73-be12-aac3feab4b02\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.430180 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.430252 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.431037 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.431055 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c01111-188d-4d73-be12-aac3feab4b02-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.438336 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c01111-188d-4d73-be12-aac3feab4b02-kube-api-access-8qc4n" (OuterVolumeSpecName: "kube-api-access-8qc4n") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). InnerVolumeSpecName "kube-api-access-8qc4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.438345 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-scripts" (OuterVolumeSpecName: "scripts") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.446574 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.457133 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.526283 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532147 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-scripts\") pod \"8410c494-56fd-4498-ad12-e0c6dad119bf\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532198 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data\") pod \"8410c494-56fd-4498-ad12-e0c6dad119bf\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532474 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8410c494-56fd-4498-ad12-e0c6dad119bf-etc-machine-id\") pod \"8410c494-56fd-4498-ad12-e0c6dad119bf\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532541 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data-custom\") pod \"8410c494-56fd-4498-ad12-e0c6dad119bf\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532645 
4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-combined-ca-bundle\") pod \"8410c494-56fd-4498-ad12-e0c6dad119bf\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532685 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hqgz\" (UniqueName: \"kubernetes.io/projected/8410c494-56fd-4498-ad12-e0c6dad119bf-kube-api-access-8hqgz\") pod \"8410c494-56fd-4498-ad12-e0c6dad119bf\" (UID: \"8410c494-56fd-4498-ad12-e0c6dad119bf\") " Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.533835 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qc4n\" (UniqueName: \"kubernetes.io/projected/08c01111-188d-4d73-be12-aac3feab4b02-kube-api-access-8qc4n\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.533871 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.533885 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.533898 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.532637 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8410c494-56fd-4498-ad12-e0c6dad119bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") 
pod "8410c494-56fd-4498-ad12-e0c6dad119bf" (UID: "8410c494-56fd-4498-ad12-e0c6dad119bf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.538476 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-scripts" (OuterVolumeSpecName: "scripts") pod "8410c494-56fd-4498-ad12-e0c6dad119bf" (UID: "8410c494-56fd-4498-ad12-e0c6dad119bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.538520 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8410c494-56fd-4498-ad12-e0c6dad119bf-kube-api-access-8hqgz" (OuterVolumeSpecName: "kube-api-access-8hqgz") pod "8410c494-56fd-4498-ad12-e0c6dad119bf" (UID: "8410c494-56fd-4498-ad12-e0c6dad119bf"). InnerVolumeSpecName "kube-api-access-8hqgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.538638 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8410c494-56fd-4498-ad12-e0c6dad119bf" (UID: "8410c494-56fd-4498-ad12-e0c6dad119bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.559189 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-config-data" (OuterVolumeSpecName: "config-data") pod "08c01111-188d-4d73-be12-aac3feab4b02" (UID: "08c01111-188d-4d73-be12-aac3feab4b02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.591850 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8410c494-56fd-4498-ad12-e0c6dad119bf" (UID: "8410c494-56fd-4498-ad12-e0c6dad119bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.623351 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"952fb37a-2fb5-41f5-a9f6-195c94862274","Type":"ContainerStarted","Data":"d648e7feed9e293663155a592f735ac9cb5dac8d3623c77c0f481bc9d6125567"} Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.625973 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c01111-188d-4d73-be12-aac3feab4b02","Type":"ContainerDied","Data":"b836fb9ea398818bd93a709539d863d50a982dad436139d298b1b3d65661d7b6"} Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.626025 4744 scope.go:117] "RemoveContainer" containerID="dbc389ec3e3bf40949ca935842c9ff6ed70c16d1ee56e9a1edc4ae9a27014b5d" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.626204 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.629537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8410c494-56fd-4498-ad12-e0c6dad119bf","Type":"ContainerDied","Data":"d35ad5a933e3b5eb02f0e04ed41bded393636a86b0a75154de7d98ba4babea2a"} Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.629596 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.638738 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.638778 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8410c494-56fd-4498-ad12-e0c6dad119bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.638791 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.638801 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c01111-188d-4d73-be12-aac3feab4b02-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.638810 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.638821 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hqgz\" (UniqueName: \"kubernetes.io/projected/8410c494-56fd-4498-ad12-e0c6dad119bf-kube-api-access-8hqgz\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.668995 4744 scope.go:117] "RemoveContainer" containerID="3cad0288c8883e37f9ca31fae7c2654dffef18ae90c1be8acf5229ea3e1c1f0b" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.676537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data" (OuterVolumeSpecName: "config-data") pod "8410c494-56fd-4498-ad12-e0c6dad119bf" (UID: "8410c494-56fd-4498-ad12-e0c6dad119bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.701203 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.687388897 podStartE2EDuration="14.701181703s" podCreationTimestamp="2025-09-30 03:13:48 +0000 UTC" firstStartedPulling="2025-09-30 03:13:52.03317588 +0000 UTC m=+1159.206395854" lastFinishedPulling="2025-09-30 03:14:02.046968686 +0000 UTC m=+1169.220188660" observedRunningTime="2025-09-30 03:14:02.67756707 +0000 UTC m=+1169.850787044" watchObservedRunningTime="2025-09-30 03:14:02.701181703 +0000 UTC m=+1169.874401677" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.702556 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.709286 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.711168 4744 scope.go:117] "RemoveContainer" containerID="df817ee6232c687339fd952171095ffd9e3863a87c1ddb3ac1baaac7c0c7012c" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.720969 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721312 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-notification-agent" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721325 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-notification-agent" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 
03:14:02.721338 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-central-agent" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721344 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-central-agent" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721642 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api-log" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721655 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api-log" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721666 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="manila-scheduler" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721673 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="manila-scheduler" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721685 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="probe" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721700 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="probe" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721715 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721722 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721743 
4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="proxy-httpd" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721749 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="proxy-httpd" Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.721756 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="sg-core" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721762 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="sg-core" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721917 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="sg-core" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721929 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-central-agent" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721938 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="ceilometer-notification-agent" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721954 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api-log" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721961 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c01111-188d-4d73-be12-aac3feab4b02" containerName="proxy-httpd" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721971 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="manila-scheduler" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 
03:14:02.721979 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af1755b-5573-4aee-aced-42d4b10bcebc" containerName="barbican-api" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.721994 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" containerName="probe" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.723633 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.725536 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.725688 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.727176 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.739340 4744 scope.go:117] "RemoveContainer" containerID="2698a953a95589cf61a782bfc00177599f663e2ce99b6ac1637c69174697adca" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.744336 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8410c494-56fd-4498-ad12-e0c6dad119bf-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.779675 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:02 crc kubenswrapper[4744]: E0930 03:14:02.780348 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-5mgt5 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data kube-api-access-5mgt5 log-httpd run-httpd scripts sg-core-conf-yaml]: context canceled" 
pod="openstack/ceilometer-0" podUID="6a1cfb7d-bc06-423f-8a42-ac9a65f77142" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.809889 4744 scope.go:117] "RemoveContainer" containerID="805d11c22078c31436883518ab8445d3e95bab0a2c52d61452efb4cae229919d" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.839807 4744 scope.go:117] "RemoveContainer" containerID="043bd75c3b4314935500113eb155dc3a84be9e5f995dca57714a7262128780e2" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845524 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mgt5\" (UniqueName: \"kubernetes.io/projected/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-kube-api-access-5mgt5\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845600 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-scripts\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845674 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845723 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845740 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.845760 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-config-data\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.936140 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.937318 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-64cfcf86c-tq8s6" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mgt5\" (UniqueName: \"kubernetes.io/projected/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-kube-api-access-5mgt5\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947566 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-scripts\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947607 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947666 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947735 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947765 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.947789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-config-data\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.958028 
4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.964873 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-scripts\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.970696 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.972567 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:02 crc kubenswrapper[4744]: I0930 03:14:02.975033 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.002133 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-config-data\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 
03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.022073 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mgt5\" (UniqueName: \"kubernetes.io/projected/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-kube-api-access-5mgt5\") pod \"ceilometer-0\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " pod="openstack/ceilometer-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.073299 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.077390 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.099431 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.100955 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.104721 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.150138 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-config-data\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.150197 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.150240 
4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-scripts\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.150269 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.150290 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34aed00c-8bca-400a-bea5-1e7966a35388-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.150322 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xmv\" (UniqueName: \"kubernetes.io/projected/34aed00c-8bca-400a-bea5-1e7966a35388-kube-api-access-d8xmv\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.160559 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.252790 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " 
pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.252863 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-scripts\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.252898 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.252921 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34aed00c-8bca-400a-bea5-1e7966a35388-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.252959 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xmv\" (UniqueName: \"kubernetes.io/projected/34aed00c-8bca-400a-bea5-1e7966a35388-kube-api-access-d8xmv\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.253021 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-config-data\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.253517 4744 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34aed00c-8bca-400a-bea5-1e7966a35388-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.258023 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.259354 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.264890 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-scripts\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.277080 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34aed00c-8bca-400a-bea5-1e7966a35388-config-data\") pod \"manila-scheduler-0\" (UID: \"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.286927 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xmv\" (UniqueName: \"kubernetes.io/projected/34aed00c-8bca-400a-bea5-1e7966a35388-kube-api-access-d8xmv\") pod \"manila-scheduler-0\" (UID: 
\"34aed00c-8bca-400a-bea5-1e7966a35388\") " pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.449748 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.520238 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c01111-188d-4d73-be12-aac3feab4b02" path="/var/lib/kubelet/pods/08c01111-188d-4d73-be12-aac3feab4b02/volumes" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.520932 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8410c494-56fd-4498-ad12-e0c6dad119bf" path="/var/lib/kubelet/pods/8410c494-56fd-4498-ad12-e0c6dad119bf/volumes" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.677241 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.696638 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.762670 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-run-httpd\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.762776 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-scripts\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.762804 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-sg-core-conf-yaml\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.762828 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mgt5\" (UniqueName: \"kubernetes.io/projected/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-kube-api-access-5mgt5\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.762884 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-config-data\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.762956 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-combined-ca-bundle\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.763068 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-log-httpd\") pod \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\" (UID: \"6a1cfb7d-bc06-423f-8a42-ac9a65f77142\") " Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.763513 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.763638 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.764266 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.768053 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.768083 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-config-data" (OuterVolumeSpecName: "config-data") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.768383 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-scripts" (OuterVolumeSpecName: "scripts") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.768891 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.770304 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-kube-api-access-5mgt5" (OuterVolumeSpecName: "kube-api-access-5mgt5") pod "6a1cfb7d-bc06-423f-8a42-ac9a65f77142" (UID: "6a1cfb7d-bc06-423f-8a42-ac9a65f77142"). InnerVolumeSpecName "kube-api-access-5mgt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.865244 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.865286 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.865298 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.865310 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mgt5\" (UniqueName: \"kubernetes.io/projected/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-kube-api-access-5mgt5\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.865336 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.865348 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1cfb7d-bc06-423f-8a42-ac9a65f77142-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:03 crc kubenswrapper[4744]: I0930 03:14:03.884075 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Sep 30 03:14:03 crc kubenswrapper[4744]: W0930 03:14:03.889175 4744 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34aed00c_8bca_400a_bea5_1e7966a35388.slice/crio-58e440268ad643703bce9b6f0d918d847e1d4ab617aa8d40230585f8a6a21a98 WatchSource:0}: Error finding container 58e440268ad643703bce9b6f0d918d847e1d4ab617aa8d40230585f8a6a21a98: Status 404 returned error can't find the container with id 58e440268ad643703bce9b6f0d918d847e1d4ab617aa8d40230585f8a6a21a98 Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.068255 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.690863 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.692420 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"34aed00c-8bca-400a-bea5-1e7966a35388","Type":"ContainerStarted","Data":"e4495b03f86fea5e8df0a7be6126a8f427010b5dc4b1bb2ceaa86c84e6d83331"} Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.692457 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"34aed00c-8bca-400a-bea5-1e7966a35388","Type":"ContainerStarted","Data":"58e440268ad643703bce9b6f0d918d847e1d4ab617aa8d40230585f8a6a21a98"} Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.748428 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.755599 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.770345 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.774965 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.778144 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.779083 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.787906 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.895695 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.895845 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-config-data\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.895898 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.895952 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-scripts\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " 
pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.896034 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-run-httpd\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.896106 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjts9\" (UniqueName: \"kubernetes.io/projected/59761566-208e-47fa-b4b1-987db294ba92-kube-api-access-zjts9\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.896173 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-log-httpd\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997430 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997494 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-config-data\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997516 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997540 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-scripts\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997581 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-run-httpd\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997635 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjts9\" (UniqueName: \"kubernetes.io/projected/59761566-208e-47fa-b4b1-987db294ba92-kube-api-access-zjts9\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.997700 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-log-httpd\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc kubenswrapper[4744]: I0930 03:14:04.998086 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-run-httpd\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:04 crc 
kubenswrapper[4744]: I0930 03:14:04.998134 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-log-httpd\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.002751 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.003628 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-config-data\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.004597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-scripts\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.005650 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.018122 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjts9\" (UniqueName: \"kubernetes.io/projected/59761566-208e-47fa-b4b1-987db294ba92-kube-api-access-zjts9\") pod \"ceilometer-0\" (UID: 
\"59761566-208e-47fa-b4b1-987db294ba92\") " pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.111929 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.326904 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.512711 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1cfb7d-bc06-423f-8a42-ac9a65f77142" path="/var/lib/kubelet/pods/6a1cfb7d-bc06-423f-8a42-ac9a65f77142/volumes" Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.587172 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.701573 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerStarted","Data":"1245416fff7ceb4ec075eedd418b200871f786568d6dab3ca588302cd5ddb19b"} Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.703326 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"34aed00c-8bca-400a-bea5-1e7966a35388","Type":"ContainerStarted","Data":"effeb9f88b78651df8edee987854fa4ffcd77b19e341613ac897f645ff25d27f"} Sep 30 03:14:05 crc kubenswrapper[4744]: I0930 03:14:05.726688 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.7266667509999998 podStartE2EDuration="2.726666751s" podCreationTimestamp="2025-09-30 03:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:05.721153501 +0000 UTC m=+1172.894373475" watchObservedRunningTime="2025-09-30 03:14:05.726666751 +0000 UTC m=+1172.899886735" Sep 30 03:14:06 crc kubenswrapper[4744]: 
I0930 03:14:06.368955 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Sep 30 03:14:06 crc kubenswrapper[4744]: I0930 03:14:06.472106 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:14:06 crc kubenswrapper[4744]: I0930 03:14:06.714690 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerStarted","Data":"1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096"} Sep 30 03:14:06 crc kubenswrapper[4744]: I0930 03:14:06.716432 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="manila-share" containerID="cri-o://9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f" gracePeriod=30 Sep 30 03:14:06 crc kubenswrapper[4744]: I0930 03:14:06.716461 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="probe" containerID="cri-o://aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc" gracePeriod=30 Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.690574 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.723744 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerStarted","Data":"f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414"} Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725200 4744 generic.go:334] "Generic (PLEG): container finished" podID="1486a12d-9554-48ca-899d-1286e1b5913b" containerID="aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc" exitCode=0 Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725233 4744 generic.go:334] "Generic (PLEG): container finished" podID="1486a12d-9554-48ca-899d-1286e1b5913b" containerID="9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f" exitCode=1 Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725257 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1486a12d-9554-48ca-899d-1286e1b5913b","Type":"ContainerDied","Data":"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc"} Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725283 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725302 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1486a12d-9554-48ca-899d-1286e1b5913b","Type":"ContainerDied","Data":"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f"} Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725317 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1486a12d-9554-48ca-899d-1286e1b5913b","Type":"ContainerDied","Data":"ca7b83a4f6adefdcf036ab83869b72faff10f33aaf493403988d25b6e9204f82"} Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.725338 4744 scope.go:117] "RemoveContainer" containerID="aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.744885 4744 scope.go:117] "RemoveContainer" containerID="9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.769683 4744 scope.go:117] "RemoveContainer" containerID="aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc" Sep 30 03:14:07 crc kubenswrapper[4744]: E0930 03:14:07.770249 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc\": container with ID starting with aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc not found: ID does not exist" containerID="aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.770291 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc"} err="failed to get container status 
\"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc\": rpc error: code = NotFound desc = could not find container \"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc\": container with ID starting with aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc not found: ID does not exist" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.770320 4744 scope.go:117] "RemoveContainer" containerID="9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f" Sep 30 03:14:07 crc kubenswrapper[4744]: E0930 03:14:07.770663 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f\": container with ID starting with 9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f not found: ID does not exist" containerID="9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.770703 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f"} err="failed to get container status \"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f\": rpc error: code = NotFound desc = could not find container \"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f\": container with ID starting with 9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f not found: ID does not exist" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.770728 4744 scope.go:117] "RemoveContainer" containerID="aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.771258 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc"} err="failed to get 
container status \"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc\": rpc error: code = NotFound desc = could not find container \"aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc\": container with ID starting with aa0015ad4bd70d94c1704bb349b53400a458ceced9bf5752ae9d7321ae3fb3bc not found: ID does not exist" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.771280 4744 scope.go:117] "RemoveContainer" containerID="9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.771654 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f"} err="failed to get container status \"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f\": rpc error: code = NotFound desc = could not find container \"9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f\": container with ID starting with 9ee984236ae17dcfa7bf4bd740d20cbe90928061048d43fe5cd1b570f7a4cd0f not found: ID does not exist" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848340 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-combined-ca-bundle\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848407 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data-custom\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848435 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" 
(UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-var-lib-manila\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848473 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-scripts\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848527 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848604 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-ceph\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848619 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-etc-machine-id\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.848651 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrw2s\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-kube-api-access-rrw2s\") pod \"1486a12d-9554-48ca-899d-1286e1b5913b\" (UID: \"1486a12d-9554-48ca-899d-1286e1b5913b\") " Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.849406 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.849449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.853414 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-scripts" (OuterVolumeSpecName: "scripts") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.853674 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.853710 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-kube-api-access-rrw2s" (OuterVolumeSpecName: "kube-api-access-rrw2s") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "kube-api-access-rrw2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.856112 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-ceph" (OuterVolumeSpecName: "ceph") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.913497 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951082 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951107 4744 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951117 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrw2s\" (UniqueName: \"kubernetes.io/projected/1486a12d-9554-48ca-899d-1286e1b5913b-kube-api-access-rrw2s\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951126 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951134 4744 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951144 4744 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1486a12d-9554-48ca-899d-1286e1b5913b-var-lib-manila\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.951154 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:07 crc kubenswrapper[4744]: I0930 03:14:07.976526 4744 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data" (OuterVolumeSpecName: "config-data") pod "1486a12d-9554-48ca-899d-1286e1b5913b" (UID: "1486a12d-9554-48ca-899d-1286e1b5913b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.052642 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1486a12d-9554-48ca-899d-1286e1b5913b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.060274 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.072265 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.089752 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:14:08 crc kubenswrapper[4744]: E0930 03:14:08.090184 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="manila-share" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.090210 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="manila-share" Sep 30 03:14:08 crc kubenswrapper[4744]: E0930 03:14:08.090260 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="probe" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.090272 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="probe" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.090505 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="manila-share" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.090543 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" containerName="probe" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.091686 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.097533 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.106132 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.153500 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/8f4d1853-2bfc-4470-be87-65c81ff45b97-kube-api-access-6bmd9\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.153554 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8f4d1853-2bfc-4470-be87-65c81ff45b97-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.153699 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc 
kubenswrapper[4744]: I0930 03:14:08.153988 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.154078 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f4d1853-2bfc-4470-be87-65c81ff45b97-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.154102 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f4d1853-2bfc-4470-be87-65c81ff45b97-ceph\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.154172 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-scripts\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.154208 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-config-data\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256522 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256613 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f4d1853-2bfc-4470-be87-65c81ff45b97-ceph\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256652 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f4d1853-2bfc-4470-be87-65c81ff45b97-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-scripts\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-config-data\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256776 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmd9\" (UniqueName: 
\"kubernetes.io/projected/8f4d1853-2bfc-4470-be87-65c81ff45b97-kube-api-access-6bmd9\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256810 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8f4d1853-2bfc-4470-be87-65c81ff45b97-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256813 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f4d1853-2bfc-4470-be87-65c81ff45b97-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.256883 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.257291 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8f4d1853-2bfc-4470-be87-65c81ff45b97-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.262597 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.264553 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-config-data\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.266179 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-scripts\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.266465 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8f4d1853-2bfc-4470-be87-65c81ff45b97-ceph\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.274435 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f4d1853-2bfc-4470-be87-65c81ff45b97-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.281444 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmd9\" (UniqueName: \"kubernetes.io/projected/8f4d1853-2bfc-4470-be87-65c81ff45b97-kube-api-access-6bmd9\") pod \"manila-share-share1-0\" (UID: \"8f4d1853-2bfc-4470-be87-65c81ff45b97\") " pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.453803 4744 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/manila-share-share1-0" Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.738116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerStarted","Data":"c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0"} Sep 30 03:14:08 crc kubenswrapper[4744]: I0930 03:14:08.744725 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-787b588c76-v5mnn" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.013216 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.513934 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1486a12d-9554-48ca-899d-1286e1b5913b" path="/var/lib/kubelet/pods/1486a12d-9554-48ca-899d-1286e1b5913b/volumes" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.785203 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7cwff"] Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.787387 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.788896 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerStarted","Data":"ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2"} Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.788923 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-central-agent" containerID="cri-o://1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096" gracePeriod=30 Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.788966 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.789002 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="proxy-httpd" containerID="cri-o://ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2" gracePeriod=30 Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.788998 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-notification-agent" containerID="cri-o://f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414" gracePeriod=30 Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.789032 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="sg-core" containerID="cri-o://c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0" gracePeriod=30 Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.798486 4744 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7cwff"] Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.825618 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8f4d1853-2bfc-4470-be87-65c81ff45b97","Type":"ContainerStarted","Data":"ddbdccf7ca399e3fd281c00ac79ab69cb5623b9cbc9516824f80819c4e0048ea"} Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.825673 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8f4d1853-2bfc-4470-be87-65c81ff45b97","Type":"ContainerStarted","Data":"2b76bbdfa0333867206933e0f83dc7809cebb2b8f78a47dcfa51026b43df9597"} Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.890576 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zgjn\" (UniqueName: \"kubernetes.io/projected/88f73e88-9ffe-4d46-9869-9e3b22e2054e-kube-api-access-7zgjn\") pod \"nova-api-db-create-7cwff\" (UID: \"88f73e88-9ffe-4d46-9869-9e3b22e2054e\") " pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.941182 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-brwgn"] Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.943047 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.948445 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.315071998 podStartE2EDuration="5.948423876s" podCreationTimestamp="2025-09-30 03:14:04 +0000 UTC" firstStartedPulling="2025-09-30 03:14:05.591737185 +0000 UTC m=+1172.764957159" lastFinishedPulling="2025-09-30 03:14:09.225089063 +0000 UTC m=+1176.398309037" observedRunningTime="2025-09-30 03:14:09.89218682 +0000 UTC m=+1177.065406794" watchObservedRunningTime="2025-09-30 03:14:09.948423876 +0000 UTC m=+1177.121643840" Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.990438 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-brwgn"] Sep 30 03:14:09 crc kubenswrapper[4744]: I0930 03:14:09.993571 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zgjn\" (UniqueName: \"kubernetes.io/projected/88f73e88-9ffe-4d46-9869-9e3b22e2054e-kube-api-access-7zgjn\") pod \"nova-api-db-create-7cwff\" (UID: \"88f73e88-9ffe-4d46-9869-9e3b22e2054e\") " pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.018807 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zgjn\" (UniqueName: \"kubernetes.io/projected/88f73e88-9ffe-4d46-9869-9e3b22e2054e-kube-api-access-7zgjn\") pod \"nova-api-db-create-7cwff\" (UID: \"88f73e88-9ffe-4d46-9869-9e3b22e2054e\") " pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.070610 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-swpzg"] Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.072249 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.080549 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-swpzg"] Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.095490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllnx\" (UniqueName: \"kubernetes.io/projected/435ec201-6b70-44c7-bafa-9e203803ed2b-kube-api-access-zllnx\") pod \"nova-cell0-db-create-brwgn\" (UID: \"435ec201-6b70-44c7-bafa-9e203803ed2b\") " pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.108749 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.198444 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllnx\" (UniqueName: \"kubernetes.io/projected/435ec201-6b70-44c7-bafa-9e203803ed2b-kube-api-access-zllnx\") pod \"nova-cell0-db-create-brwgn\" (UID: \"435ec201-6b70-44c7-bafa-9e203803ed2b\") " pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.198760 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjf5\" (UniqueName: \"kubernetes.io/projected/292dafaf-ce57-46a1-8430-7d4f7baf831c-kube-api-access-7cjf5\") pod \"nova-cell1-db-create-swpzg\" (UID: \"292dafaf-ce57-46a1-8430-7d4f7baf831c\") " pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.221719 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllnx\" (UniqueName: \"kubernetes.io/projected/435ec201-6b70-44c7-bafa-9e203803ed2b-kube-api-access-zllnx\") pod \"nova-cell0-db-create-brwgn\" (UID: \"435ec201-6b70-44c7-bafa-9e203803ed2b\") " 
pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.300641 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjf5\" (UniqueName: \"kubernetes.io/projected/292dafaf-ce57-46a1-8430-7d4f7baf831c-kube-api-access-7cjf5\") pod \"nova-cell1-db-create-swpzg\" (UID: \"292dafaf-ce57-46a1-8430-7d4f7baf831c\") " pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.302796 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.328237 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjf5\" (UniqueName: \"kubernetes.io/projected/292dafaf-ce57-46a1-8430-7d4f7baf831c-kube-api-access-7cjf5\") pod \"nova-cell1-db-create-swpzg\" (UID: \"292dafaf-ce57-46a1-8430-7d4f7baf831c\") " pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.570268 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.629509 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7cwff"] Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.801087 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-brwgn"] Sep 30 03:14:10 crc kubenswrapper[4744]: W0930 03:14:10.804526 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod435ec201_6b70_44c7_bafa_9e203803ed2b.slice/crio-80649edc0bdf10b2161a7f20993de276f365dd86d440247a4cfb98722382814a WatchSource:0}: Error finding container 80649edc0bdf10b2161a7f20993de276f365dd86d440247a4cfb98722382814a: Status 404 returned error can't find the container with id 80649edc0bdf10b2161a7f20993de276f365dd86d440247a4cfb98722382814a Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.861049 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7cwff" event={"ID":"88f73e88-9ffe-4d46-9869-9e3b22e2054e","Type":"ContainerStarted","Data":"89f20fc28ba5ae7d0b0f88bf67963ca3450a0ade0b5d232d39a09847c5ff828e"} Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.864560 4744 generic.go:334] "Generic (PLEG): container finished" podID="59761566-208e-47fa-b4b1-987db294ba92" containerID="ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2" exitCode=0 Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.864588 4744 generic.go:334] "Generic (PLEG): container finished" podID="59761566-208e-47fa-b4b1-987db294ba92" containerID="c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0" exitCode=2 Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.864597 4744 generic.go:334] "Generic (PLEG): container finished" podID="59761566-208e-47fa-b4b1-987db294ba92" containerID="f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414" 
exitCode=0 Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.864631 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerDied","Data":"ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2"} Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.864659 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerDied","Data":"c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0"} Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.864669 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerDied","Data":"f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414"} Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.867019 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8f4d1853-2bfc-4470-be87-65c81ff45b97","Type":"ContainerStarted","Data":"2a7c8baeef50925f43ff46497375cfe0819bbfc0e0c1cf7b8215deb378a6487e"} Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.867886 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-brwgn" event={"ID":"435ec201-6b70-44c7-bafa-9e203803ed2b","Type":"ContainerStarted","Data":"80649edc0bdf10b2161a7f20993de276f365dd86d440247a4cfb98722382814a"} Sep 30 03:14:10 crc kubenswrapper[4744]: I0930 03:14:10.887694 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.887677586 podStartE2EDuration="2.887677586s" podCreationTimestamp="2025-09-30 03:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:10.88585976 +0000 UTC m=+1178.059079734" 
watchObservedRunningTime="2025-09-30 03:14:10.887677586 +0000 UTC m=+1178.060897560" Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.030679 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-swpzg"] Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.881559 4744 generic.go:334] "Generic (PLEG): container finished" podID="292dafaf-ce57-46a1-8430-7d4f7baf831c" containerID="eb8870a882743cbec0d66513f609ccacc5d94b1a910a483be2779f2102f8c675" exitCode=0 Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.881617 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-swpzg" event={"ID":"292dafaf-ce57-46a1-8430-7d4f7baf831c","Type":"ContainerDied","Data":"eb8870a882743cbec0d66513f609ccacc5d94b1a910a483be2779f2102f8c675"} Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.881970 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-swpzg" event={"ID":"292dafaf-ce57-46a1-8430-7d4f7baf831c","Type":"ContainerStarted","Data":"efc46143186642170eedf89d8a47596ef519aec8c924d117349fe2eadc230e34"} Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.883289 4744 generic.go:334] "Generic (PLEG): container finished" podID="88f73e88-9ffe-4d46-9869-9e3b22e2054e" containerID="bca3bced2322eef2371d5365d9db01ab72bd9ef9ea1799eda234c24c262ddc2f" exitCode=0 Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.883331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7cwff" event={"ID":"88f73e88-9ffe-4d46-9869-9e3b22e2054e","Type":"ContainerDied","Data":"bca3bced2322eef2371d5365d9db01ab72bd9ef9ea1799eda234c24c262ddc2f"} Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 03:14:11.884914 4744 generic.go:334] "Generic (PLEG): container finished" podID="435ec201-6b70-44c7-bafa-9e203803ed2b" containerID="356de144e3b104e081ed63954db6371fab98cc4d2223df6981a4a9a4f20b3176" exitCode=0 Sep 30 03:14:11 crc kubenswrapper[4744]: I0930 
03:14:11.885399 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-brwgn" event={"ID":"435ec201-6b70-44c7-bafa-9e203803ed2b","Type":"ContainerDied","Data":"356de144e3b104e081ed63954db6371fab98cc4d2223df6981a4a9a4f20b3176"} Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.451830 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.461944 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.580196 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllnx\" (UniqueName: \"kubernetes.io/projected/435ec201-6b70-44c7-bafa-9e203803ed2b-kube-api-access-zllnx\") pod \"435ec201-6b70-44c7-bafa-9e203803ed2b\" (UID: \"435ec201-6b70-44c7-bafa-9e203803ed2b\") " Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.587633 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435ec201-6b70-44c7-bafa-9e203803ed2b-kube-api-access-zllnx" (OuterVolumeSpecName: "kube-api-access-zllnx") pod "435ec201-6b70-44c7-bafa-9e203803ed2b" (UID: "435ec201-6b70-44c7-bafa-9e203803ed2b"). InnerVolumeSpecName "kube-api-access-zllnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.658821 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.659568 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.683386 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zllnx\" (UniqueName: \"kubernetes.io/projected/435ec201-6b70-44c7-bafa-9e203803ed2b-kube-api-access-zllnx\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.784110 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjf5\" (UniqueName: \"kubernetes.io/projected/292dafaf-ce57-46a1-8430-7d4f7baf831c-kube-api-access-7cjf5\") pod \"292dafaf-ce57-46a1-8430-7d4f7baf831c\" (UID: \"292dafaf-ce57-46a1-8430-7d4f7baf831c\") " Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.784172 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zgjn\" (UniqueName: \"kubernetes.io/projected/88f73e88-9ffe-4d46-9869-9e3b22e2054e-kube-api-access-7zgjn\") pod \"88f73e88-9ffe-4d46-9869-9e3b22e2054e\" (UID: \"88f73e88-9ffe-4d46-9869-9e3b22e2054e\") " Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.790605 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292dafaf-ce57-46a1-8430-7d4f7baf831c-kube-api-access-7cjf5" (OuterVolumeSpecName: "kube-api-access-7cjf5") pod "292dafaf-ce57-46a1-8430-7d4f7baf831c" (UID: "292dafaf-ce57-46a1-8430-7d4f7baf831c"). InnerVolumeSpecName "kube-api-access-7cjf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.790664 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f73e88-9ffe-4d46-9869-9e3b22e2054e-kube-api-access-7zgjn" (OuterVolumeSpecName: "kube-api-access-7zgjn") pod "88f73e88-9ffe-4d46-9869-9e3b22e2054e" (UID: "88f73e88-9ffe-4d46-9869-9e3b22e2054e"). InnerVolumeSpecName "kube-api-access-7zgjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.887146 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjf5\" (UniqueName: \"kubernetes.io/projected/292dafaf-ce57-46a1-8430-7d4f7baf831c-kube-api-access-7cjf5\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.887473 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zgjn\" (UniqueName: \"kubernetes.io/projected/88f73e88-9ffe-4d46-9869-9e3b22e2054e-kube-api-access-7zgjn\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.903442 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-swpzg" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.903455 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-swpzg" event={"ID":"292dafaf-ce57-46a1-8430-7d4f7baf831c","Type":"ContainerDied","Data":"efc46143186642170eedf89d8a47596ef519aec8c924d117349fe2eadc230e34"} Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.903490 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc46143186642170eedf89d8a47596ef519aec8c924d117349fe2eadc230e34" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.905327 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7cwff" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.905320 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7cwff" event={"ID":"88f73e88-9ffe-4d46-9869-9e3b22e2054e","Type":"ContainerDied","Data":"89f20fc28ba5ae7d0b0f88bf67963ca3450a0ade0b5d232d39a09847c5ff828e"} Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.905567 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89f20fc28ba5ae7d0b0f88bf67963ca3450a0ade0b5d232d39a09847c5ff828e" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.907039 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-brwgn" event={"ID":"435ec201-6b70-44c7-bafa-9e203803ed2b","Type":"ContainerDied","Data":"80649edc0bdf10b2161a7f20993de276f365dd86d440247a4cfb98722382814a"} Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.907073 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80649edc0bdf10b2161a7f20993de276f365dd86d440247a4cfb98722382814a" Sep 30 03:14:13 crc kubenswrapper[4744]: I0930 03:14:13.907083 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-brwgn" Sep 30 03:14:14 crc kubenswrapper[4744]: I0930 03:14:14.916115 4744 generic.go:334] "Generic (PLEG): container finished" podID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerID="4ff69006dc4a5c42d13adad171a2a3135ef326af540050a2d29e2347ee5a8552" exitCode=137 Sep 30 03:14:14 crc kubenswrapper[4744]: I0930 03:14:14.916191 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-787b588c76-v5mnn" event={"ID":"1f214ceb-c91a-4672-8711-9728a3f5e3f3","Type":"ContainerDied","Data":"4ff69006dc4a5c42d13adad171a2a3135ef326af540050a2d29e2347ee5a8552"} Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.214562 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307362 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f214ceb-c91a-4672-8711-9728a3f5e3f3-logs\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307432 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lr98\" (UniqueName: \"kubernetes.io/projected/1f214ceb-c91a-4672-8711-9728a3f5e3f3-kube-api-access-6lr98\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307472 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-tls-certs\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307510 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-config-data\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307553 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-combined-ca-bundle\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307617 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-secret-key\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.307729 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-scripts\") pod \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\" (UID: \"1f214ceb-c91a-4672-8711-9728a3f5e3f3\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.308043 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f214ceb-c91a-4672-8711-9728a3f5e3f3-logs" (OuterVolumeSpecName: "logs") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.308239 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f214ceb-c91a-4672-8711-9728a3f5e3f3-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.327234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.329571 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f214ceb-c91a-4672-8711-9728a3f5e3f3-kube-api-access-6lr98" (OuterVolumeSpecName: "kube-api-access-6lr98") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "kube-api-access-6lr98". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.334139 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-config-data" (OuterVolumeSpecName: "config-data") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.355530 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.361733 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-scripts" (OuterVolumeSpecName: "scripts") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.374564 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1f214ceb-c91a-4672-8711-9728a3f5e3f3" (UID: "1f214ceb-c91a-4672-8711-9728a3f5e3f3"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.409978 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.410010 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lr98\" (UniqueName: \"kubernetes.io/projected/1f214ceb-c91a-4672-8711-9728a3f5e3f3-kube-api-access-6lr98\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.410021 4744 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.410030 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f214ceb-c91a-4672-8711-9728a3f5e3f3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.410040 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.410049 4744 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/1f214ceb-c91a-4672-8711-9728a3f5e3f3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.852867 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917026 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-config-data\") pod \"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917138 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-log-httpd\") pod \"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917215 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-sg-core-conf-yaml\") pod \"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917330 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjts9\" (UniqueName: \"kubernetes.io/projected/59761566-208e-47fa-b4b1-987db294ba92-kube-api-access-zjts9\") pod \"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917405 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-scripts\") pod 
\"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917508 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-combined-ca-bundle\") pod \"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.917535 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-run-httpd\") pod \"59761566-208e-47fa-b4b1-987db294ba92\" (UID: \"59761566-208e-47fa-b4b1-987db294ba92\") " Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.918405 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.922111 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.926693 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-scripts" (OuterVolumeSpecName: "scripts") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.929544 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59761566-208e-47fa-b4b1-987db294ba92-kube-api-access-zjts9" (OuterVolumeSpecName: "kube-api-access-zjts9") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). InnerVolumeSpecName "kube-api-access-zjts9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.935329 4744 generic.go:334] "Generic (PLEG): container finished" podID="59761566-208e-47fa-b4b1-987db294ba92" containerID="1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096" exitCode=0 Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.935402 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.935417 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerDied","Data":"1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096"} Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.936564 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"59761566-208e-47fa-b4b1-987db294ba92","Type":"ContainerDied","Data":"1245416fff7ceb4ec075eedd418b200871f786568d6dab3ca588302cd5ddb19b"} Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.936677 4744 scope.go:117] "RemoveContainer" containerID="ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.942892 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-787b588c76-v5mnn" 
event={"ID":"1f214ceb-c91a-4672-8711-9728a3f5e3f3","Type":"ContainerDied","Data":"c6afb023f98893d5e3a79976ecc11f66ceb9a1dedcb89d27e61f309090ccfaac"} Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.943387 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-787b588c76-v5mnn" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.966361 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.972258 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-787b588c76-v5mnn"] Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.986012 4744 scope.go:117] "RemoveContainer" containerID="c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0" Sep 30 03:14:15 crc kubenswrapper[4744]: I0930 03:14:15.986129 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-787b588c76-v5mnn"] Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.005390 4744 scope.go:117] "RemoveContainer" containerID="f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.010155 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.019641 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjts9\" (UniqueName: \"kubernetes.io/projected/59761566-208e-47fa-b4b1-987db294ba92-kube-api-access-zjts9\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.019669 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.019679 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.019688 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.019703 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59761566-208e-47fa-b4b1-987db294ba92-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.019712 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.024913 4744 scope.go:117] "RemoveContainer" containerID="1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.041282 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-config-data" (OuterVolumeSpecName: "config-data") pod "59761566-208e-47fa-b4b1-987db294ba92" (UID: "59761566-208e-47fa-b4b1-987db294ba92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.046543 4744 scope.go:117] "RemoveContainer" containerID="ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.046909 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2\": container with ID starting with ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2 not found: ID does not exist" containerID="ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.046967 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2"} err="failed to get container status \"ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2\": rpc error: code = NotFound desc = could not find container \"ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2\": container with ID starting with ee2889ca451fe6150a8e789c14ef8b494919ec7b8e25f382a09124b4f0bcb1e2 not found: ID does not exist" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.046996 4744 scope.go:117] "RemoveContainer" containerID="c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.047637 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0\": container with ID starting with 
c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0 not found: ID does not exist" containerID="c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.047670 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0"} err="failed to get container status \"c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0\": rpc error: code = NotFound desc = could not find container \"c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0\": container with ID starting with c2b6a52cd31b805e4441727ff36c7e6f099899faa8e4ec837f2b73fa24056fe0 not found: ID does not exist" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.047698 4744 scope.go:117] "RemoveContainer" containerID="f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.047983 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414\": container with ID starting with f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414 not found: ID does not exist" containerID="f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.048034 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414"} err="failed to get container status \"f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414\": rpc error: code = NotFound desc = could not find container \"f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414\": container with ID starting with f23228c76573abe332a6b36f454c2c5ce26e90396c0c27bc463d72642b10c414 not found: ID does not 
exist" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.048090 4744 scope.go:117] "RemoveContainer" containerID="1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.049733 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096\": container with ID starting with 1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096 not found: ID does not exist" containerID="1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.049762 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096"} err="failed to get container status \"1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096\": rpc error: code = NotFound desc = could not find container \"1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096\": container with ID starting with 1ea41ca5bdffdcadead514ae3fe3142cad7682eb12f1a8450b5ecb2c0a35e096 not found: ID does not exist" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.049776 4744 scope.go:117] "RemoveContainer" containerID="4a0f5cb143bcfa57cbcb5bcafed97ea70a4e3574637e8b06418ee825b8820047" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.121267 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59761566-208e-47fa-b4b1-987db294ba92-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.220110 4744 scope.go:117] "RemoveContainer" containerID="4ff69006dc4a5c42d13adad171a2a3135ef326af540050a2d29e2347ee5a8552" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.277912 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.287147 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.330581 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.330974 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f73e88-9ffe-4d46-9869-9e3b22e2054e" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.330991 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f73e88-9ffe-4d46-9869-9e3b22e2054e" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331000 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331006 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331020 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292dafaf-ce57-46a1-8430-7d4f7baf831c" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331027 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="292dafaf-ce57-46a1-8430-7d4f7baf831c" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331041 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="sg-core" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331047 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="sg-core" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331060 4744 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-notification-agent" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331065 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-notification-agent" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331073 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435ec201-6b70-44c7-bafa-9e203803ed2b" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331079 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="435ec201-6b70-44c7-bafa-9e203803ed2b" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331087 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="proxy-httpd" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331093 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="proxy-httpd" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331104 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon-log" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331111 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon-log" Sep 30 03:14:16 crc kubenswrapper[4744]: E0930 03:14:16.331128 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-central-agent" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331135 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-central-agent" Sep 30 03:14:16 crc 
kubenswrapper[4744]: I0930 03:14:16.331318 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="proxy-httpd" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331329 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon-log" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331337 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-central-agent" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331348 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f73e88-9ffe-4d46-9869-9e3b22e2054e" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331357 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="ceilometer-notification-agent" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331381 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" containerName="horizon" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331393 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="59761566-208e-47fa-b4b1-987db294ba92" containerName="sg-core" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331404 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="292dafaf-ce57-46a1-8430-7d4f7baf831c" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.331416 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="435ec201-6b70-44c7-bafa-9e203803ed2b" containerName="mariadb-database-create" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.332956 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.335345 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.350443 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.361803 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.426986 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.427081 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-scripts\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.427146 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.427211 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-log-httpd\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " 
pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.427231 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mkt\" (UniqueName: \"kubernetes.io/projected/9533e162-2cdb-4d89-9fbf-74320ba61bb3-kube-api-access-g7mkt\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.427257 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-run-httpd\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.427277 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-config-data\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529001 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-scripts\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529069 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529126 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-log-httpd\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529148 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mkt\" (UniqueName: \"kubernetes.io/projected/9533e162-2cdb-4d89-9fbf-74320ba61bb3-kube-api-access-g7mkt\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529178 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-run-httpd\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529198 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-config-data\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529237 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.529859 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-run-httpd\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc 
kubenswrapper[4744]: I0930 03:14:16.529917 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-log-httpd\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.532462 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-scripts\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.532937 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.533204 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.533440 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-config-data\") pod \"ceilometer-0\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.550146 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7mkt\" (UniqueName: \"kubernetes.io/projected/9533e162-2cdb-4d89-9fbf-74320ba61bb3-kube-api-access-g7mkt\") pod \"ceilometer-0\" (UID: 
\"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " pod="openstack/ceilometer-0" Sep 30 03:14:16 crc kubenswrapper[4744]: I0930 03:14:16.650276 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:17 crc kubenswrapper[4744]: I0930 03:14:17.206977 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:17 crc kubenswrapper[4744]: W0930 03:14:17.215647 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9533e162_2cdb_4d89_9fbf_74320ba61bb3.slice/crio-82535a5eeb6bd09d0b471d84d01451cc80ab5b08dbe5c11bf7353b75329bfa74 WatchSource:0}: Error finding container 82535a5eeb6bd09d0b471d84d01451cc80ab5b08dbe5c11bf7353b75329bfa74: Status 404 returned error can't find the container with id 82535a5eeb6bd09d0b471d84d01451cc80ab5b08dbe5c11bf7353b75329bfa74 Sep 30 03:14:17 crc kubenswrapper[4744]: I0930 03:14:17.515590 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f214ceb-c91a-4672-8711-9728a3f5e3f3" path="/var/lib/kubelet/pods/1f214ceb-c91a-4672-8711-9728a3f5e3f3/volumes" Sep 30 03:14:17 crc kubenswrapper[4744]: I0930 03:14:17.516841 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59761566-208e-47fa-b4b1-987db294ba92" path="/var/lib/kubelet/pods/59761566-208e-47fa-b4b1-987db294ba92/volumes" Sep 30 03:14:17 crc kubenswrapper[4744]: I0930 03:14:17.965810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerStarted","Data":"bd0a922f41ee31b10a262a1ce89767074917490b51c3b27c94d740ee6f495ca2"} Sep 30 03:14:17 crc kubenswrapper[4744]: I0930 03:14:17.965887 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerStarted","Data":"82535a5eeb6bd09d0b471d84d01451cc80ab5b08dbe5c11bf7353b75329bfa74"} Sep 30 03:14:18 crc kubenswrapper[4744]: I0930 03:14:18.454882 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Sep 30 03:14:18 crc kubenswrapper[4744]: I0930 03:14:18.977898 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerStarted","Data":"7b341f596d8f440a73801eb8de66fdb07c0054d4b334370d79afefb5b914386c"} Sep 30 03:14:18 crc kubenswrapper[4744]: I0930 03:14:18.984478 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:14:18 crc kubenswrapper[4744]: I0930 03:14:18.984680 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-log" containerID="cri-o://3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883" gracePeriod=30 Sep 30 03:14:18 crc kubenswrapper[4744]: I0930 03:14:18.984892 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-httpd" containerID="cri-o://5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade" gracePeriod=30 Sep 30 03:14:19 crc kubenswrapper[4744]: I0930 03:14:19.923243 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-22c7-account-create-qnv8l"] Sep 30 03:14:19 crc kubenswrapper[4744]: I0930 03:14:19.924947 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:19 crc kubenswrapper[4744]: I0930 03:14:19.927200 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 03:14:19 crc kubenswrapper[4744]: I0930 03:14:19.936832 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-22c7-account-create-qnv8l"] Sep 30 03:14:19 crc kubenswrapper[4744]: I0930 03:14:19.997800 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njdvt\" (UniqueName: \"kubernetes.io/projected/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8-kube-api-access-njdvt\") pod \"nova-api-22c7-account-create-qnv8l\" (UID: \"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8\") " pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.002958 4744 generic.go:334] "Generic (PLEG): container finished" podID="d90c9655-5af0-4978-8c33-23be71d00047" containerID="3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883" exitCode=143 Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.003018 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d90c9655-5af0-4978-8c33-23be71d00047","Type":"ContainerDied","Data":"3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883"} Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.005005 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerStarted","Data":"b83a584faa63cdd26b51e52d7a2043caf2b1867d169e7e4224abc244eb47dde2"} Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.051711 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.051942 4744 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-internal-api-0" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-log" containerID="cri-o://847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec" gracePeriod=30 Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.052061 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-httpd" containerID="cri-o://4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2" gracePeriod=30 Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.099140 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njdvt\" (UniqueName: \"kubernetes.io/projected/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8-kube-api-access-njdvt\") pod \"nova-api-22c7-account-create-qnv8l\" (UID: \"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8\") " pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.114338 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e6bc-account-create-n922k"] Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.115648 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.121161 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.121322 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njdvt\" (UniqueName: \"kubernetes.io/projected/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8-kube-api-access-njdvt\") pod \"nova-api-22c7-account-create-qnv8l\" (UID: \"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8\") " pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.124124 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e6bc-account-create-n922k"] Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.201319 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjb9k\" (UniqueName: \"kubernetes.io/projected/d3272d22-2919-4fb7-98ea-9193216bcbd3-kube-api-access-fjb9k\") pod \"nova-cell0-e6bc-account-create-n922k\" (UID: \"d3272d22-2919-4fb7-98ea-9193216bcbd3\") " pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.241744 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.303240 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjb9k\" (UniqueName: \"kubernetes.io/projected/d3272d22-2919-4fb7-98ea-9193216bcbd3-kube-api-access-fjb9k\") pod \"nova-cell0-e6bc-account-create-n922k\" (UID: \"d3272d22-2919-4fb7-98ea-9193216bcbd3\") " pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.319009 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-428f-account-create-wnn55"] Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.321011 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.323083 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.331939 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-428f-account-create-wnn55"] Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.354285 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjb9k\" (UniqueName: \"kubernetes.io/projected/d3272d22-2919-4fb7-98ea-9193216bcbd3-kube-api-access-fjb9k\") pod \"nova-cell0-e6bc-account-create-n922k\" (UID: \"d3272d22-2919-4fb7-98ea-9193216bcbd3\") " pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.406434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2jw\" (UniqueName: \"kubernetes.io/projected/3465d7e4-f246-47a0-a809-d690670848f5-kube-api-access-mz2jw\") pod \"nova-cell1-428f-account-create-wnn55\" (UID: \"3465d7e4-f246-47a0-a809-d690670848f5\") " 
pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.508311 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2jw\" (UniqueName: \"kubernetes.io/projected/3465d7e4-f246-47a0-a809-d690670848f5-kube-api-access-mz2jw\") pod \"nova-cell1-428f-account-create-wnn55\" (UID: \"3465d7e4-f246-47a0-a809-d690670848f5\") " pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.512694 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.536499 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2jw\" (UniqueName: \"kubernetes.io/projected/3465d7e4-f246-47a0-a809-d690670848f5-kube-api-access-mz2jw\") pod \"nova-cell1-428f-account-create-wnn55\" (UID: \"3465d7e4-f246-47a0-a809-d690670848f5\") " pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.739763 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.740982 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-22c7-account-create-qnv8l"] Sep 30 03:14:20 crc kubenswrapper[4744]: W0930 03:14:20.744657 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21dd3df4_6b6f_48ea_80fa_8a9d9c6785e8.slice/crio-9659b959f130ed037c5a913e8ef69d7c962465f977f341c6750e6c68e9140884 WatchSource:0}: Error finding container 9659b959f130ed037c5a913e8ef69d7c962465f977f341c6750e6c68e9140884: Status 404 returned error can't find the container with id 9659b959f130ed037c5a913e8ef69d7c962465f977f341c6750e6c68e9140884 Sep 30 03:14:20 crc kubenswrapper[4744]: W0930 03:14:20.997530 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3272d22_2919_4fb7_98ea_9193216bcbd3.slice/crio-ff607c8d1d74435fc208ec31ac391b560e0e9c87d40b729da7191e47c7a0a962 WatchSource:0}: Error finding container ff607c8d1d74435fc208ec31ac391b560e0e9c87d40b729da7191e47c7a0a962: Status 404 returned error can't find the container with id ff607c8d1d74435fc208ec31ac391b560e0e9c87d40b729da7191e47c7a0a962 Sep 30 03:14:20 crc kubenswrapper[4744]: I0930 03:14:20.998788 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e6bc-account-create-n922k"] Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.035408 4744 generic.go:334] "Generic (PLEG): container finished" podID="21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8" containerID="be7cae86c5eccb5a76e18067e5661cfc3b620a095dac057bb3b488c517e2144f" exitCode=0 Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.035469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22c7-account-create-qnv8l" 
event={"ID":"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8","Type":"ContainerDied","Data":"be7cae86c5eccb5a76e18067e5661cfc3b620a095dac057bb3b488c517e2144f"} Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.035495 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22c7-account-create-qnv8l" event={"ID":"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8","Type":"ContainerStarted","Data":"9659b959f130ed037c5a913e8ef69d7c962465f977f341c6750e6c68e9140884"} Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.039948 4744 generic.go:334] "Generic (PLEG): container finished" podID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerID="847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec" exitCode=143 Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.040013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2daed0a-2a52-458f-a872-1f7b875e1a39","Type":"ContainerDied","Data":"847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec"} Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.041892 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerStarted","Data":"163c5487663dcea44a842f72fc4ba22b469500bb8848aca8e65c300b4608cd40"} Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.042666 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.043489 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e6bc-account-create-n922k" event={"ID":"d3272d22-2919-4fb7-98ea-9193216bcbd3","Type":"ContainerStarted","Data":"ff607c8d1d74435fc208ec31ac391b560e0e9c87d40b729da7191e47c7a0a962"} Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.095512 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=1.6597889380000002 podStartE2EDuration="5.095492154s" podCreationTimestamp="2025-09-30 03:14:16 +0000 UTC" firstStartedPulling="2025-09-30 03:14:17.219817237 +0000 UTC m=+1184.393037211" lastFinishedPulling="2025-09-30 03:14:20.655520453 +0000 UTC m=+1187.828740427" observedRunningTime="2025-09-30 03:14:21.088694552 +0000 UTC m=+1188.261914516" watchObservedRunningTime="2025-09-30 03:14:21.095492154 +0000 UTC m=+1188.268712128" Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.198532 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-428f-account-create-wnn55"] Sep 30 03:14:21 crc kubenswrapper[4744]: I0930 03:14:21.791216 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.055988 4744 generic.go:334] "Generic (PLEG): container finished" podID="d3272d22-2919-4fb7-98ea-9193216bcbd3" containerID="137c7c9fe46767d0b0d131f578e64c5697d5242268518b209b4b739a02626e73" exitCode=0 Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.056047 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e6bc-account-create-n922k" event={"ID":"d3272d22-2919-4fb7-98ea-9193216bcbd3","Type":"ContainerDied","Data":"137c7c9fe46767d0b0d131f578e64c5697d5242268518b209b4b739a02626e73"} Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.058717 4744 generic.go:334] "Generic (PLEG): container finished" podID="3465d7e4-f246-47a0-a809-d690670848f5" containerID="0ed099b0bb12b047fec70953124f0245395bac3e510e85b762650789a71d8d71" exitCode=0 Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.058783 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-428f-account-create-wnn55" event={"ID":"3465d7e4-f246-47a0-a809-d690670848f5","Type":"ContainerDied","Data":"0ed099b0bb12b047fec70953124f0245395bac3e510e85b762650789a71d8d71"} Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.058824 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-428f-account-create-wnn55" event={"ID":"3465d7e4-f246-47a0-a809-d690670848f5","Type":"ContainerStarted","Data":"299e34081d0b78ebad56ebb4744c88db3aeaa9f4994a5d68808d5b366358f833"} Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.445059 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.554760 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njdvt\" (UniqueName: \"kubernetes.io/projected/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8-kube-api-access-njdvt\") pod \"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8\" (UID: \"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.570780 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8-kube-api-access-njdvt" (OuterVolumeSpecName: "kube-api-access-njdvt") pod "21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8" (UID: "21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8"). InnerVolumeSpecName "kube-api-access-njdvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.656971 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njdvt\" (UniqueName: \"kubernetes.io/projected/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8-kube-api-access-njdvt\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.657608 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.758975 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-combined-ca-bundle\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759158 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-config-data\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759205 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-logs\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759244 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-ceph\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759269 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-scripts\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759290 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759321 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-public-tls-certs\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759365 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-httpd-run\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.759573 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkr6s\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-kube-api-access-qkr6s\") pod \"d90c9655-5af0-4978-8c33-23be71d00047\" (UID: \"d90c9655-5af0-4978-8c33-23be71d00047\") " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.760206 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-logs" (OuterVolumeSpecName: "logs") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.760660 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.767497 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-ceph" (OuterVolumeSpecName: "ceph") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.773574 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-scripts" (OuterVolumeSpecName: "scripts") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.786550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-kube-api-access-qkr6s" (OuterVolumeSpecName: "kube-api-access-qkr6s") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "kube-api-access-qkr6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.828519 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.864816 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.865013 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.865124 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.865201 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.865271 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d90c9655-5af0-4978-8c33-23be71d00047-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.865325 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkr6s\" (UniqueName: \"kubernetes.io/projected/d90c9655-5af0-4978-8c33-23be71d00047-kube-api-access-qkr6s\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.887555 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.898453 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.937344 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.946621 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-config-data" (OuterVolumeSpecName: "config-data") pod "d90c9655-5af0-4978-8c33-23be71d00047" (UID: "d90c9655-5af0-4978-8c33-23be71d00047"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.969493 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.969521 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.969531 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:22 crc kubenswrapper[4744]: I0930 03:14:22.969540 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90c9655-5af0-4978-8c33-23be71d00047-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.070208 4744 generic.go:334] "Generic (PLEG): container finished" podID="d90c9655-5af0-4978-8c33-23be71d00047" containerID="5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade" exitCode=0 Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.070291 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d90c9655-5af0-4978-8c33-23be71d00047","Type":"ContainerDied","Data":"5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade"} Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.070302 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.070438 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d90c9655-5af0-4978-8c33-23be71d00047","Type":"ContainerDied","Data":"8be5408d7809b20c1c8b035b6f38cb0ec660fe0e50045a15ea836391e097be46"} Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.070466 4744 scope.go:117] "RemoveContainer" containerID="5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.072178 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-22c7-account-create-qnv8l" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.074694 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-22c7-account-create-qnv8l" event={"ID":"21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8","Type":"ContainerDied","Data":"9659b959f130ed037c5a913e8ef69d7c962465f977f341c6750e6c68e9140884"} Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.074738 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9659b959f130ed037c5a913e8ef69d7c962465f977f341c6750e6c68e9140884" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.074970 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-central-agent" containerID="cri-o://bd0a922f41ee31b10a262a1ce89767074917490b51c3b27c94d740ee6f495ca2" gracePeriod=30 Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.075108 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="proxy-httpd" containerID="cri-o://163c5487663dcea44a842f72fc4ba22b469500bb8848aca8e65c300b4608cd40" gracePeriod=30 Sep 
30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.075152 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="sg-core" containerID="cri-o://b83a584faa63cdd26b51e52d7a2043caf2b1867d169e7e4224abc244eb47dde2" gracePeriod=30 Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.075208 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-notification-agent" containerID="cri-o://7b341f596d8f440a73801eb8de66fdb07c0054d4b334370d79afefb5b914386c" gracePeriod=30 Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.106828 4744 scope.go:117] "RemoveContainer" containerID="3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.112476 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.120784 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.136952 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:14:23 crc kubenswrapper[4744]: E0930 03:14:23.137354 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-httpd" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.137390 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-httpd" Sep 30 03:14:23 crc kubenswrapper[4744]: E0930 03:14:23.137400 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8" containerName="mariadb-account-create" Sep 30 03:14:23 crc 
kubenswrapper[4744]: I0930 03:14:23.137406 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8" containerName="mariadb-account-create" Sep 30 03:14:23 crc kubenswrapper[4744]: E0930 03:14:23.137418 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-log" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.137424 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-log" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.137581 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-httpd" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.137597 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90c9655-5af0-4978-8c33-23be71d00047" containerName="glance-log" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.137607 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8" containerName="mariadb-account-create" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.138567 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.140586 4744 scope.go:117] "RemoveContainer" containerID="5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade" Sep 30 03:14:23 crc kubenswrapper[4744]: E0930 03:14:23.144653 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade\": container with ID starting with 5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade not found: ID does not exist" containerID="5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.144701 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade"} err="failed to get container status \"5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade\": rpc error: code = NotFound desc = could not find container \"5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade\": container with ID starting with 5a95466765e3631a17cf0bace2fbbef3ef11c998b91661590ea5fa79b7d4fade not found: ID does not exist" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.144753 4744 scope.go:117] "RemoveContainer" containerID="3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.144955 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.146377 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 03:14:23 crc kubenswrapper[4744]: E0930 03:14:23.149480 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883\": container with ID starting with 3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883 not found: ID does not exist" containerID="3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.149516 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883"} err="failed to get container status \"3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883\": rpc error: code = NotFound desc = could not find container \"3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883\": container with ID starting with 3274d981360a0c3c94fe340f1f4bd7793238e5593fdf767f8a23311b15479883 not found: ID does not exist" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.185013 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.298815 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299380 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a6b749-14c4-4726-b176-160667e2651d-ceph\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299413 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299451 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299505 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299523 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2t4c\" (UniqueName: \"kubernetes.io/projected/f6a6b749-14c4-4726-b176-160667e2651d-kube-api-access-g2t4c\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299553 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a6b749-14c4-4726-b176-160667e2651d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299623 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.299698 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a6b749-14c4-4726-b176-160667e2651d-logs\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a6b749-14c4-4726-b176-160667e2651d-logs\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402579 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402601 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a6b749-14c4-4726-b176-160667e2651d-ceph\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402626 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402646 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2t4c\" (UniqueName: \"kubernetes.io/projected/f6a6b749-14c4-4726-b176-160667e2651d-kube-api-access-g2t4c\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a6b749-14c4-4726-b176-160667e2651d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.402760 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.408699 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.410000 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6a6b749-14c4-4726-b176-160667e2651d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.410391 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.410672 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a6b749-14c4-4726-b176-160667e2651d-logs\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.411359 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.413547 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f6a6b749-14c4-4726-b176-160667e2651d-ceph\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.415993 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.421934 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a6b749-14c4-4726-b176-160667e2651d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.447821 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2t4c\" (UniqueName: \"kubernetes.io/projected/f6a6b749-14c4-4726-b176-160667e2651d-kube-api-access-g2t4c\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.552656 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90c9655-5af0-4978-8c33-23be71d00047" path="/var/lib/kubelet/pods/d90c9655-5af0-4978-8c33-23be71d00047/volumes" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.633645 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"f6a6b749-14c4-4726-b176-160667e2651d\") " pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.695207 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.697619 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.787151 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.793555 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.834961 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz2jw\" (UniqueName: \"kubernetes.io/projected/3465d7e4-f246-47a0-a809-d690670848f5-kube-api-access-mz2jw\") pod \"3465d7e4-f246-47a0-a809-d690670848f5\" (UID: \"3465d7e4-f246-47a0-a809-d690670848f5\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.835118 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjb9k\" (UniqueName: \"kubernetes.io/projected/d3272d22-2919-4fb7-98ea-9193216bcbd3-kube-api-access-fjb9k\") pod \"d3272d22-2919-4fb7-98ea-9193216bcbd3\" (UID: \"d3272d22-2919-4fb7-98ea-9193216bcbd3\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.838424 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3272d22-2919-4fb7-98ea-9193216bcbd3-kube-api-access-fjb9k" (OuterVolumeSpecName: 
"kube-api-access-fjb9k") pod "d3272d22-2919-4fb7-98ea-9193216bcbd3" (UID: "d3272d22-2919-4fb7-98ea-9193216bcbd3"). InnerVolumeSpecName "kube-api-access-fjb9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.842585 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3465d7e4-f246-47a0-a809-d690670848f5-kube-api-access-mz2jw" (OuterVolumeSpecName: "kube-api-access-mz2jw") pod "3465d7e4-f246-47a0-a809-d690670848f5" (UID: "3465d7e4-f246-47a0-a809-d690670848f5"). InnerVolumeSpecName "kube-api-access-mz2jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936166 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-httpd-run\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936462 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-logs\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936506 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-config-data\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936524 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-internal-tls-certs\") pod 
\"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936631 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9xlj\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-kube-api-access-w9xlj\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936680 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936702 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-combined-ca-bundle\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936734 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-scripts\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.936769 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-ceph\") pod \"a2daed0a-2a52-458f-a872-1f7b875e1a39\" (UID: \"a2daed0a-2a52-458f-a872-1f7b875e1a39\") " Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.937167 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz2jw\" (UniqueName: 
\"kubernetes.io/projected/3465d7e4-f246-47a0-a809-d690670848f5-kube-api-access-mz2jw\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.937178 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjb9k\" (UniqueName: \"kubernetes.io/projected/d3272d22-2919-4fb7-98ea-9193216bcbd3-kube-api-access-fjb9k\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.940235 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-ceph" (OuterVolumeSpecName: "ceph") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.940477 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.940688 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-logs" (OuterVolumeSpecName: "logs") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.946191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-scripts" (OuterVolumeSpecName: "scripts") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.948610 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.950645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-kube-api-access-w9xlj" (OuterVolumeSpecName: "kube-api-access-w9xlj") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "kube-api-access-w9xlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:23 crc kubenswrapper[4744]: I0930 03:14:23.991455 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.011899 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.028873 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-config-data" (OuterVolumeSpecName: "config-data") pod "a2daed0a-2a52-458f-a872-1f7b875e1a39" (UID: "a2daed0a-2a52-458f-a872-1f7b875e1a39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038890 4744 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038923 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2daed0a-2a52-458f-a872-1f7b875e1a39-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038932 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038942 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038951 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9xlj\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-kube-api-access-w9xlj\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038981 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038990 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.038998 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2daed0a-2a52-458f-a872-1f7b875e1a39-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.039005 4744 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a2daed0a-2a52-458f-a872-1f7b875e1a39-ceph\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.065093 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.096817 4744 generic.go:334] "Generic (PLEG): container finished" podID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerID="4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2" exitCode=0 Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.096856 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2daed0a-2a52-458f-a872-1f7b875e1a39","Type":"ContainerDied","Data":"4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.096884 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.096914 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a2daed0a-2a52-458f-a872-1f7b875e1a39","Type":"ContainerDied","Data":"1e8ecd2101d12b8a8a6e354f29903c43f5a6a84a7a8dc0dcd250a806576a7859"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.096945 4744 scope.go:117] "RemoveContainer" containerID="4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.107981 4744 generic.go:334] "Generic (PLEG): container finished" podID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerID="163c5487663dcea44a842f72fc4ba22b469500bb8848aca8e65c300b4608cd40" exitCode=0 Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.108008 4744 generic.go:334] "Generic (PLEG): container finished" podID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerID="b83a584faa63cdd26b51e52d7a2043caf2b1867d169e7e4224abc244eb47dde2" exitCode=2 Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.108017 4744 generic.go:334] "Generic (PLEG): container finished" podID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerID="7b341f596d8f440a73801eb8de66fdb07c0054d4b334370d79afefb5b914386c" exitCode=0 Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.108051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerDied","Data":"163c5487663dcea44a842f72fc4ba22b469500bb8848aca8e65c300b4608cd40"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.108073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerDied","Data":"b83a584faa63cdd26b51e52d7a2043caf2b1867d169e7e4224abc244eb47dde2"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.108083 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerDied","Data":"7b341f596d8f440a73801eb8de66fdb07c0054d4b334370d79afefb5b914386c"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.109834 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-428f-account-create-wnn55" event={"ID":"3465d7e4-f246-47a0-a809-d690670848f5","Type":"ContainerDied","Data":"299e34081d0b78ebad56ebb4744c88db3aeaa9f4994a5d68808d5b366358f833"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.109859 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299e34081d0b78ebad56ebb4744c88db3aeaa9f4994a5d68808d5b366358f833" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.109920 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-428f-account-create-wnn55" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.133764 4744 scope.go:117] "RemoveContainer" containerID="847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.134152 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e6bc-account-create-n922k" event={"ID":"d3272d22-2919-4fb7-98ea-9193216bcbd3","Type":"ContainerDied","Data":"ff607c8d1d74435fc208ec31ac391b560e0e9c87d40b729da7191e47c7a0a962"} Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.134184 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff607c8d1d74435fc208ec31ac391b560e0e9c87d40b729da7191e47c7a0a962" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.134234 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e6bc-account-create-n922k" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.137258 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.140930 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.146744 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.156183 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:14:24 crc kubenswrapper[4744]: E0930 03:14:24.156538 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3272d22-2919-4fb7-98ea-9193216bcbd3" containerName="mariadb-account-create" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.156555 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3272d22-2919-4fb7-98ea-9193216bcbd3" containerName="mariadb-account-create" Sep 30 03:14:24 crc kubenswrapper[4744]: E0930 03:14:24.156576 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3465d7e4-f246-47a0-a809-d690670848f5" containerName="mariadb-account-create" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.156583 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3465d7e4-f246-47a0-a809-d690670848f5" containerName="mariadb-account-create" Sep 30 03:14:24 crc kubenswrapper[4744]: E0930 03:14:24.156605 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-httpd" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.156611 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" 
containerName="glance-httpd" Sep 30 03:14:24 crc kubenswrapper[4744]: E0930 03:14:24.156624 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-log" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.156629 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-log" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.157101 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3465d7e4-f246-47a0-a809-d690670848f5" containerName="mariadb-account-create" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.157125 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-log" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.157135 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" containerName="glance-httpd" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.157144 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3272d22-2919-4fb7-98ea-9193216bcbd3" containerName="mariadb-account-create" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.158004 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.161557 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.165798 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.177766 4744 scope.go:117] "RemoveContainer" containerID="4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2" Sep 30 03:14:24 crc kubenswrapper[4744]: E0930 03:14:24.179682 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2\": container with ID starting with 4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2 not found: ID does not exist" containerID="4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.179736 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2"} err="failed to get container status \"4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2\": rpc error: code = NotFound desc = could not find container \"4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2\": container with ID starting with 4532ebdc602f827814f722a7e2c687cd83e677b868497671a7f57fa31e587ca2 not found: ID does not exist" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.179762 4744 scope.go:117] "RemoveContainer" containerID="847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec" Sep 30 03:14:24 crc kubenswrapper[4744]: E0930 03:14:24.182704 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec\": container with ID starting with 847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec not found: ID does not exist" containerID="847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.182740 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec"} err="failed to get container status \"847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec\": rpc error: code = NotFound desc = could not find container \"847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec\": container with ID starting with 847f29ae20ff3aa00cd991fe6ce8ae8f7da4228a13c16ccb87fbedd49d8b19ec not found: ID does not exist" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.190188 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242670 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242721 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242760 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242807 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242840 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242865 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242881 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242922 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.242946 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmk2\" (UniqueName: \"kubernetes.io/projected/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-kube-api-access-xpmk2\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344569 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344620 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344637 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344678 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344704 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmk2\" (UniqueName: \"kubernetes.io/projected/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-kube-api-access-xpmk2\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344751 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344779 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344814 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.344858 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.345290 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.346880 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.347508 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-logs\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.351954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.352554 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.352891 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.353844 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.361556 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.374612 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmk2\" (UniqueName: \"kubernetes.io/projected/8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa-kube-api-access-xpmk2\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.379452 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.407668 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa\") " pod="openstack/glance-default-internal-api-0" Sep 30 03:14:24 crc kubenswrapper[4744]: I0930 03:14:24.479832 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.074991 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.160335 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa","Type":"ContainerStarted","Data":"bc1e197fcf57e17c5868facdb4e8f6a18438782b2378ce84941e3d022a9037b5"} Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.163664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a6b749-14c4-4726-b176-160667e2651d","Type":"ContainerStarted","Data":"3d9f6eaf6188cb9d9eb297a0d12dd6a193344b31429709493c98fa4efe6a6629"} Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.163693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a6b749-14c4-4726-b176-160667e2651d","Type":"ContainerStarted","Data":"9c436bdcc84e62012617c87733c2ba4f4389600421e50219f0dc56f3b2a3fbff"} Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.207290 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.575188 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2daed0a-2a52-458f-a872-1f7b875e1a39" path="/var/lib/kubelet/pods/a2daed0a-2a52-458f-a872-1f7b875e1a39/volumes" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.576295 4744 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77vc8"] Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.577667 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77vc8"] Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.577742 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.585668 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7xqr2" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.585863 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.585984 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.679192 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.679324 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-scripts\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.679434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-config-data\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.679652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvct\" (UniqueName: \"kubernetes.io/projected/23e0e876-4407-4f01-8f41-31a30f8dbb93-kube-api-access-jhvct\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.781385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-scripts\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.781502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-config-data\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.781580 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvct\" (UniqueName: \"kubernetes.io/projected/23e0e876-4407-4f01-8f41-31a30f8dbb93-kube-api-access-jhvct\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.781646 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.793050 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-config-data\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.793576 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-scripts\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.793689 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.805811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvct\" (UniqueName: \"kubernetes.io/projected/23e0e876-4407-4f01-8f41-31a30f8dbb93-kube-api-access-jhvct\") pod \"nova-cell0-conductor-db-sync-77vc8\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") " pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:25 crc kubenswrapper[4744]: I0930 03:14:25.911777 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.197070 4744 generic.go:334] "Generic (PLEG): container finished" podID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerID="bd0a922f41ee31b10a262a1ce89767074917490b51c3b27c94d740ee6f495ca2" exitCode=0 Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.197215 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerDied","Data":"bd0a922f41ee31b10a262a1ce89767074917490b51c3b27c94d740ee6f495ca2"} Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.203490 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6a6b749-14c4-4726-b176-160667e2651d","Type":"ContainerStarted","Data":"550daa12b4607d64b3eade6cf7e644de064c9bb42a837ddbfa3015985b700e85"} Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.215992 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa","Type":"ContainerStarted","Data":"622cf4b88b460717a247fa9fc657800727019416d2468ebcb6855bbb551d35be"} Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.245428 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.265029 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.265007822 podStartE2EDuration="3.265007822s" podCreationTimestamp="2025-09-30 03:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:26.236504958 +0000 UTC m=+1193.409724932" watchObservedRunningTime="2025-09-30 03:14:26.265007822 +0000 UTC m=+1193.438227796" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.393327 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-run-httpd\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.393437 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-log-httpd\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.393466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7mkt\" (UniqueName: \"kubernetes.io/projected/9533e162-2cdb-4d89-9fbf-74320ba61bb3-kube-api-access-g7mkt\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.393801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-sg-core-conf-yaml\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: 
\"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.393829 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.393994 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-config-data\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.394049 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.394068 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-scripts\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.394108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-combined-ca-bundle\") pod \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\" (UID: \"9533e162-2cdb-4d89-9fbf-74320ba61bb3\") " Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.394899 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.394916 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9533e162-2cdb-4d89-9fbf-74320ba61bb3-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.399310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9533e162-2cdb-4d89-9fbf-74320ba61bb3-kube-api-access-g7mkt" (OuterVolumeSpecName: "kube-api-access-g7mkt") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "kube-api-access-g7mkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.414609 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-scripts" (OuterVolumeSpecName: "scripts") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.448811 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.492756 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77vc8"] Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.496456 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7mkt\" (UniqueName: \"kubernetes.io/projected/9533e162-2cdb-4d89-9fbf-74320ba61bb3-kube-api-access-g7mkt\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.496488 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.496498 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.510153 4744 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.561286 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-config-data" (OuterVolumeSpecName: "config-data") pod "9533e162-2cdb-4d89-9fbf-74320ba61bb3" (UID: "9533e162-2cdb-4d89-9fbf-74320ba61bb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.599712 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:26 crc kubenswrapper[4744]: I0930 03:14:26.599736 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9533e162-2cdb-4d89-9fbf-74320ba61bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.223149 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa","Type":"ContainerStarted","Data":"38bd965e033a9a042426a2960ee0abde5e031ea30b33a8eefd5dd4c9de1c9dac"} Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.225297 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77vc8" event={"ID":"23e0e876-4407-4f01-8f41-31a30f8dbb93","Type":"ContainerStarted","Data":"cb64643b5a6e58f65c8c65cbdf1f12abadc291651d5be3bb1bf4e1c2990aafa6"} Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 
03:14:27.228073 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.232692 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9533e162-2cdb-4d89-9fbf-74320ba61bb3","Type":"ContainerDied","Data":"82535a5eeb6bd09d0b471d84d01451cc80ab5b08dbe5c11bf7353b75329bfa74"} Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.232739 4744 scope.go:117] "RemoveContainer" containerID="163c5487663dcea44a842f72fc4ba22b469500bb8848aca8e65c300b4608cd40" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.258916 4744 scope.go:117] "RemoveContainer" containerID="b83a584faa63cdd26b51e52d7a2043caf2b1867d169e7e4224abc244eb47dde2" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.262803 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.262778269 podStartE2EDuration="3.262778269s" podCreationTimestamp="2025-09-30 03:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:27.252614963 +0000 UTC m=+1194.425834957" watchObservedRunningTime="2025-09-30 03:14:27.262778269 +0000 UTC m=+1194.435998243" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.283340 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.292815 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310072 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:27 crc kubenswrapper[4744]: E0930 03:14:27.310533 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="sg-core" Sep 30 
03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310551 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="sg-core" Sep 30 03:14:27 crc kubenswrapper[4744]: E0930 03:14:27.310561 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-central-agent" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310568 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-central-agent" Sep 30 03:14:27 crc kubenswrapper[4744]: E0930 03:14:27.310591 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-notification-agent" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310598 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-notification-agent" Sep 30 03:14:27 crc kubenswrapper[4744]: E0930 03:14:27.310616 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="proxy-httpd" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310622 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="proxy-httpd" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310789 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-notification-agent" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310808 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="ceilometer-central-agent" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310819 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="proxy-httpd" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.310834 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" containerName="sg-core" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.312623 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.316675 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.317819 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.333535 4744 scope.go:117] "RemoveContainer" containerID="7b341f596d8f440a73801eb8de66fdb07c0054d4b334370d79afefb5b914386c" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.333999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.375775 4744 scope.go:117] "RemoveContainer" containerID="bd0a922f41ee31b10a262a1ce89767074917490b51c3b27c94d740ee6f495ca2" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413286 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-log-httpd\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413349 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " 
pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413395 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-config-data\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413436 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-scripts\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413462 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55jwg\" (UniqueName: \"kubernetes.io/projected/93be78c9-96eb-4efb-9701-d2d8af3d38cd-kube-api-access-55jwg\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413531 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.413555 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-run-httpd\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.514476 4744 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="9533e162-2cdb-4d89-9fbf-74320ba61bb3" path="/var/lib/kubelet/pods/9533e162-2cdb-4d89-9fbf-74320ba61bb3/volumes" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515010 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-scripts\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515094 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55jwg\" (UniqueName: \"kubernetes.io/projected/93be78c9-96eb-4efb-9701-d2d8af3d38cd-kube-api-access-55jwg\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515250 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-run-httpd\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515482 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-log-httpd\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515528 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.515574 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-config-data\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.517520 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-run-httpd\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.519419 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.520925 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-config-data\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.521931 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-log-httpd\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " 
pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.529953 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-scripts\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.530041 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.532169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55jwg\" (UniqueName: \"kubernetes.io/projected/93be78c9-96eb-4efb-9701-d2d8af3d38cd-kube-api-access-55jwg\") pod \"ceilometer-0\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") " pod="openstack/ceilometer-0" Sep 30 03:14:27 crc kubenswrapper[4744]: I0930 03:14:27.640582 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 03:14:28 crc kubenswrapper[4744]: I0930 03:14:28.104352 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:28 crc kubenswrapper[4744]: W0930 03:14:28.117767 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93be78c9_96eb_4efb_9701_d2d8af3d38cd.slice/crio-a6d749eed3f8edc90881ca8fa8ae2faa8ef3b49892a6a577705390479e491f55 WatchSource:0}: Error finding container a6d749eed3f8edc90881ca8fa8ae2faa8ef3b49892a6a577705390479e491f55: Status 404 returned error can't find the container with id a6d749eed3f8edc90881ca8fa8ae2faa8ef3b49892a6a577705390479e491f55
Sep 30 03:14:28 crc kubenswrapper[4744]: I0930 03:14:28.242011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerStarted","Data":"a6d749eed3f8edc90881ca8fa8ae2faa8ef3b49892a6a577705390479e491f55"}
Sep 30 03:14:29 crc kubenswrapper[4744]: I0930 03:14:29.258226 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerStarted","Data":"35c26549da12c14a6703728df0b889038f271440bf251e6fb296fab25bb5bee6"}
Sep 30 03:14:29 crc kubenswrapper[4744]: I0930 03:14:29.966960 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Sep 30 03:14:30 crc kubenswrapper[4744]: I0930 03:14:30.268841 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerStarted","Data":"ade752918b23978451e40c6578d21ad7a35ca56bb2fbe1cf4760f6ebe6fdc12e"}
Sep 30 03:14:33 crc kubenswrapper[4744]: I0930 03:14:33.787584 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:33 crc kubenswrapper[4744]: I0930 03:14:33.788067 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:33 crc kubenswrapper[4744]: I0930 03:14:33.832500 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:33 crc kubenswrapper[4744]: I0930 03:14:33.833047 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.301263 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.301584 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.481511 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.481557 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.526297 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.563293 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:34 crc kubenswrapper[4744]: I0930 03:14:34.653857 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 03:14:35 crc kubenswrapper[4744]: I0930 03:14:35.150930 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:35 crc kubenswrapper[4744]: I0930 03:14:35.307860 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:35 crc kubenswrapper[4744]: I0930 03:14:35.308195 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.317229 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerStarted","Data":"5443defb717ddcefd27734571d9174107995f0f218cf60b11a8f197aceb1b2f7"}
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.319016 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77vc8" event={"ID":"23e0e876-4407-4f01-8f41-31a30f8dbb93","Type":"ContainerStarted","Data":"3250bea0036aed0d33edbfb5fc8a9a3369cf28bc3db4c5e4149f5b741d8fd39a"}
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.319088 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.319109 4744 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.334239 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-77vc8" podStartSLOduration=2.583672726 podStartE2EDuration="11.334222809s" podCreationTimestamp="2025-09-30 03:14:25 +0000 UTC" firstStartedPulling="2025-09-30 03:14:26.464280215 +0000 UTC m=+1193.637500189" lastFinishedPulling="2025-09-30 03:14:35.214830298 +0000 UTC m=+1202.388050272" observedRunningTime="2025-09-30 03:14:36.331402261 +0000 UTC m=+1203.504622235" watchObservedRunningTime="2025-09-30 03:14:36.334222809 +0000 UTC m=+1203.507442783"
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.417003 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:36 crc kubenswrapper[4744]: I0930 03:14:36.616641 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.117974 4744 scope.go:117] "RemoveContainer" containerID="4e5d0d5fef3e4f1fcf0de3bb0e095d24ff04ea33b9dabfbeb79349159a0d0565"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.147197 4744 scope.go:117] "RemoveContainer" containerID="b2097a6dff1368e4988b9c3fd880635dbe95bd3470bc2ec7fe7ad537a6802c7c"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.210730 4744 scope.go:117] "RemoveContainer" containerID="01e1cfa52bcbdeedd1f0612c7d5ff98e95cafb6b0689bc34899a7628beb623b8"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.326584 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.332758 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.364552 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-central-agent" containerID="cri-o://35c26549da12c14a6703728df0b889038f271440bf251e6fb296fab25bb5bee6" gracePeriod=30
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.364696 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerStarted","Data":"d7a2e93ea06c397908b6b4e7c2c93771dbb75278841aef10882c7c76e56ebbb7"}
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.364871 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="proxy-httpd" containerID="cri-o://d7a2e93ea06c397908b6b4e7c2c93771dbb75278841aef10882c7c76e56ebbb7" gracePeriod=30
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.365092 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.365932 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-notification-agent" containerID="cri-o://ade752918b23978451e40c6578d21ad7a35ca56bb2fbe1cf4760f6ebe6fdc12e" gracePeriod=30
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.366022 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="sg-core" containerID="cri-o://5443defb717ddcefd27734571d9174107995f0f218cf60b11a8f197aceb1b2f7" gracePeriod=30
Sep 30 03:14:37 crc kubenswrapper[4744]: I0930 03:14:37.433707 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.989557204 podStartE2EDuration="10.433686411s" podCreationTimestamp="2025-09-30 03:14:27 +0000 UTC" firstStartedPulling="2025-09-30 03:14:28.122711289 +0000 UTC m=+1195.295931263" lastFinishedPulling="2025-09-30 03:14:36.566840506 +0000 UTC m=+1203.740060470" observedRunningTime="2025-09-30 03:14:37.421613426 +0000 UTC m=+1204.594833410" watchObservedRunningTime="2025-09-30 03:14:37.433686411 +0000 UTC m=+1204.606906385"
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.377202 4744 generic.go:334] "Generic (PLEG): container finished" podID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerID="d7a2e93ea06c397908b6b4e7c2c93771dbb75278841aef10882c7c76e56ebbb7" exitCode=0
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.377756 4744 generic.go:334] "Generic (PLEG): container finished" podID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerID="5443defb717ddcefd27734571d9174107995f0f218cf60b11a8f197aceb1b2f7" exitCode=2
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.377765 4744 generic.go:334] "Generic (PLEG): container finished" podID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerID="ade752918b23978451e40c6578d21ad7a35ca56bb2fbe1cf4760f6ebe6fdc12e" exitCode=0
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.377772 4744 generic.go:334] "Generic (PLEG): container finished" podID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerID="35c26549da12c14a6703728df0b889038f271440bf251e6fb296fab25bb5bee6" exitCode=0
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.378742 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerDied","Data":"d7a2e93ea06c397908b6b4e7c2c93771dbb75278841aef10882c7c76e56ebbb7"}
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.378768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerDied","Data":"5443defb717ddcefd27734571d9174107995f0f218cf60b11a8f197aceb1b2f7"}
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.378779 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerDied","Data":"ade752918b23978451e40c6578d21ad7a35ca56bb2fbe1cf4760f6ebe6fdc12e"}
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.378787 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerDied","Data":"35c26549da12c14a6703728df0b889038f271440bf251e6fb296fab25bb5bee6"}
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.494676 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.536975 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-run-httpd\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537230 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-sg-core-conf-yaml\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537344 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537537 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-combined-ca-bundle\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537596 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-scripts\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537649 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55jwg\" (UniqueName: \"kubernetes.io/projected/93be78c9-96eb-4efb-9701-d2d8af3d38cd-kube-api-access-55jwg\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537702 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-config-data\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.537792 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-log-httpd\") pod \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\" (UID: \"93be78c9-96eb-4efb-9701-d2d8af3d38cd\") "
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.538516 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.539681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.543520 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93be78c9-96eb-4efb-9701-d2d8af3d38cd-kube-api-access-55jwg" (OuterVolumeSpecName: "kube-api-access-55jwg") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "kube-api-access-55jwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.543579 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-scripts" (OuterVolumeSpecName: "scripts") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.571879 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.640850 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93be78c9-96eb-4efb-9701-d2d8af3d38cd-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.640874 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.640884 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.640892 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55jwg\" (UniqueName: \"kubernetes.io/projected/93be78c9-96eb-4efb-9701-d2d8af3d38cd-kube-api-access-55jwg\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.643381 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.661825 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-config-data" (OuterVolumeSpecName: "config-data") pod "93be78c9-96eb-4efb-9701-d2d8af3d38cd" (UID: "93be78c9-96eb-4efb-9701-d2d8af3d38cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.742790 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:38 crc kubenswrapper[4744]: I0930 03:14:38.742824 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93be78c9-96eb-4efb-9701-d2d8af3d38cd-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.391769 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93be78c9-96eb-4efb-9701-d2d8af3d38cd","Type":"ContainerDied","Data":"a6d749eed3f8edc90881ca8fa8ae2faa8ef3b49892a6a577705390479e491f55"}
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.392053 4744 scope.go:117] "RemoveContainer" containerID="d7a2e93ea06c397908b6b4e7c2c93771dbb75278841aef10882c7c76e56ebbb7"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.391936 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.418284 4744 scope.go:117] "RemoveContainer" containerID="5443defb717ddcefd27734571d9174107995f0f218cf60b11a8f197aceb1b2f7"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.430982 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.447458 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.456995 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:39 crc kubenswrapper[4744]: E0930 03:14:39.457417 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="proxy-httpd"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457435 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="proxy-httpd"
Sep 30 03:14:39 crc kubenswrapper[4744]: E0930 03:14:39.457454 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="sg-core"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457463 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="sg-core"
Sep 30 03:14:39 crc kubenswrapper[4744]: E0930 03:14:39.457484 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-central-agent"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457491 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-central-agent"
Sep 30 03:14:39 crc kubenswrapper[4744]: E0930 03:14:39.457512 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-notification-agent"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457518 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-notification-agent"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457687 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="sg-core"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457707 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="proxy-httpd"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457722 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-central-agent"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457735 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" containerName="ceilometer-notification-agent"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.457817 4744 scope.go:117] "RemoveContainer" containerID="ade752918b23978451e40c6578d21ad7a35ca56bb2fbe1cf4760f6ebe6fdc12e"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.474167 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.474284 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.476802 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.477032 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.480053 4744 scope.go:117] "RemoveContainer" containerID="35c26549da12c14a6703728df0b889038f271440bf251e6fb296fab25bb5bee6"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.515336 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93be78c9-96eb-4efb-9701-d2d8af3d38cd" path="/var/lib/kubelet/pods/93be78c9-96eb-4efb-9701-d2d8af3d38cd/volumes"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.588829 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-log-httpd\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.588877 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.588919 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-run-httpd\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.589013 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.589035 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-scripts\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.589081 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-config-data\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.589101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwfk5\" (UniqueName: \"kubernetes.io/projected/fbfa47fe-9e61-4f18-91d2-e6af1296f033-kube-api-access-bwfk5\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.690920 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-log-httpd\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.690969 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.690996 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-run-httpd\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.691080 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.691103 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-scripts\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.691136 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-config-data\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.691502 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-log-httpd\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.691528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-run-httpd\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.691911 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwfk5\" (UniqueName: \"kubernetes.io/projected/fbfa47fe-9e61-4f18-91d2-e6af1296f033-kube-api-access-bwfk5\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.695648 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-scripts\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.696773 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.697140 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-config-data\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.698362 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.708944 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwfk5\" (UniqueName: \"kubernetes.io/projected/fbfa47fe-9e61-4f18-91d2-e6af1296f033-kube-api-access-bwfk5\") pod \"ceilometer-0\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " pod="openstack/ceilometer-0"
Sep 30 03:14:39 crc kubenswrapper[4744]: I0930 03:14:39.799489 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 03:14:40 crc kubenswrapper[4744]: I0930 03:14:40.233973 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 03:14:40 crc kubenswrapper[4744]: W0930 03:14:40.253435 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbfa47fe_9e61_4f18_91d2_e6af1296f033.slice/crio-5ab0806f7c6431db3668c3af4015e967167ffc39ca8a55ec22cc4b85659e6eb1 WatchSource:0}: Error finding container 5ab0806f7c6431db3668c3af4015e967167ffc39ca8a55ec22cc4b85659e6eb1: Status 404 returned error can't find the container with id 5ab0806f7c6431db3668c3af4015e967167ffc39ca8a55ec22cc4b85659e6eb1
Sep 30 03:14:40 crc kubenswrapper[4744]: I0930 03:14:40.404605 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerStarted","Data":"5ab0806f7c6431db3668c3af4015e967167ffc39ca8a55ec22cc4b85659e6eb1"}
Sep 30 03:14:41 crc kubenswrapper[4744]: I0930 03:14:41.423646 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerStarted","Data":"55d8be7f596fdf9f46e94e98397b057a6a8862e326f50d7221f6f7cdcbc5e986"}
Sep 30 03:14:42 crc kubenswrapper[4744]: I0930 03:14:42.437175 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerStarted","Data":"9b93a3298be23f7ea43796c8d48523d96a63138e594b2576bfd24ee5e49d4a83"}
Sep 30 03:14:42 crc kubenswrapper[4744]: I0930 03:14:42.437524 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerStarted","Data":"7e814f468532a2b9aae79387b6dad42116d6571f3d39ed7e788f0f09d3fdfc01"}
Sep 30 03:14:44 crc kubenswrapper[4744]: I0930 03:14:44.462016 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerStarted","Data":"5d9de43c392f1dd77c2482e42938e5a0ff5c21049ca56a5073f48aa6162d9f11"}
Sep 30 03:14:44 crc kubenswrapper[4744]: I0930 03:14:44.462664 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 03:14:44 crc kubenswrapper[4744]: I0930 03:14:44.488304 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.625621694 podStartE2EDuration="5.488283946s" podCreationTimestamp="2025-09-30 03:14:39 +0000 UTC" firstStartedPulling="2025-09-30 03:14:40.258233215 +0000 UTC m=+1207.431453199" lastFinishedPulling="2025-09-30 03:14:44.120895427 +0000 UTC m=+1211.294115451" observedRunningTime="2025-09-30 03:14:44.487886904 +0000 UTC m=+1211.661106868" watchObservedRunningTime="2025-09-30 03:14:44.488283946 +0000 UTC m=+1211.661503920"
Sep 30 03:14:47 crc kubenswrapper[4744]: I0930 03:14:47.488957 4744 generic.go:334] "Generic (PLEG): container finished" podID="23e0e876-4407-4f01-8f41-31a30f8dbb93" containerID="3250bea0036aed0d33edbfb5fc8a9a3369cf28bc3db4c5e4149f5b741d8fd39a" exitCode=0
Sep 30 03:14:47 crc kubenswrapper[4744]: I0930 03:14:47.489099 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77vc8" event={"ID":"23e0e876-4407-4f01-8f41-31a30f8dbb93","Type":"ContainerDied","Data":"3250bea0036aed0d33edbfb5fc8a9a3369cf28bc3db4c5e4149f5b741d8fd39a"}
Sep 30 03:14:48 crc kubenswrapper[4744]: I0930 03:14:48.939757 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77vc8"
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.088672 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-scripts\") pod \"23e0e876-4407-4f01-8f41-31a30f8dbb93\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") "
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.089477 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-config-data\") pod \"23e0e876-4407-4f01-8f41-31a30f8dbb93\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") "
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.089686 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-combined-ca-bundle\") pod \"23e0e876-4407-4f01-8f41-31a30f8dbb93\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") "
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.089803 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhvct\" (UniqueName: \"kubernetes.io/projected/23e0e876-4407-4f01-8f41-31a30f8dbb93-kube-api-access-jhvct\") pod \"23e0e876-4407-4f01-8f41-31a30f8dbb93\" (UID: \"23e0e876-4407-4f01-8f41-31a30f8dbb93\") "
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.094550 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-scripts" (OuterVolumeSpecName: "scripts") pod "23e0e876-4407-4f01-8f41-31a30f8dbb93" (UID: "23e0e876-4407-4f01-8f41-31a30f8dbb93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.101455 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e0e876-4407-4f01-8f41-31a30f8dbb93-kube-api-access-jhvct" (OuterVolumeSpecName: "kube-api-access-jhvct") pod "23e0e876-4407-4f01-8f41-31a30f8dbb93" (UID: "23e0e876-4407-4f01-8f41-31a30f8dbb93"). InnerVolumeSpecName "kube-api-access-jhvct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.125457 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-config-data" (OuterVolumeSpecName: "config-data") pod "23e0e876-4407-4f01-8f41-31a30f8dbb93" (UID: "23e0e876-4407-4f01-8f41-31a30f8dbb93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.140027 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23e0e876-4407-4f01-8f41-31a30f8dbb93" (UID: "23e0e876-4407-4f01-8f41-31a30f8dbb93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.192671 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.192696 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.192705 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e0e876-4407-4f01-8f41-31a30f8dbb93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.192716 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhvct\" (UniqueName: \"kubernetes.io/projected/23e0e876-4407-4f01-8f41-31a30f8dbb93-kube-api-access-jhvct\") on node \"crc\" DevicePath \"\""
Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.511344 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-77vc8" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.516911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-77vc8" event={"ID":"23e0e876-4407-4f01-8f41-31a30f8dbb93","Type":"ContainerDied","Data":"cb64643b5a6e58f65c8c65cbdf1f12abadc291651d5be3bb1bf4e1c2990aafa6"} Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.516970 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb64643b5a6e58f65c8c65cbdf1f12abadc291651d5be3bb1bf4e1c2990aafa6" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.682461 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 03:14:49 crc kubenswrapper[4744]: E0930 03:14:49.683050 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e0e876-4407-4f01-8f41-31a30f8dbb93" containerName="nova-cell0-conductor-db-sync" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.683074 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e0e876-4407-4f01-8f41-31a30f8dbb93" containerName="nova-cell0-conductor-db-sync" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.683491 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e0e876-4407-4f01-8f41-31a30f8dbb93" containerName="nova-cell0-conductor-db-sync" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.684522 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.695792 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.726566 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.727629 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7xqr2" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.829362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.829460 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbwnr\" (UniqueName: \"kubernetes.io/projected/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-kube-api-access-hbwnr\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.829595 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.932520 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.933072 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbwnr\" (UniqueName: \"kubernetes.io/projected/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-kube-api-access-hbwnr\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.933316 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.940651 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.940900 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:49 crc kubenswrapper[4744]: I0930 03:14:49.968658 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbwnr\" (UniqueName: \"kubernetes.io/projected/9d37d81d-59fb-4686-b8b9-34ba95b98cb2-kube-api-access-hbwnr\") pod \"nova-cell0-conductor-0\" (UID: 
\"9d37d81d-59fb-4686-b8b9-34ba95b98cb2\") " pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:50 crc kubenswrapper[4744]: I0930 03:14:50.054079 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:50 crc kubenswrapper[4744]: I0930 03:14:50.524686 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 03:14:51 crc kubenswrapper[4744]: I0930 03:14:51.535938 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9d37d81d-59fb-4686-b8b9-34ba95b98cb2","Type":"ContainerStarted","Data":"00690918831f520f3e80cd99d19ae3617606291c919731cc90faa8837d17d76e"} Sep 30 03:14:51 crc kubenswrapper[4744]: I0930 03:14:51.536007 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9d37d81d-59fb-4686-b8b9-34ba95b98cb2","Type":"ContainerStarted","Data":"861d878e06b7936a7b1b236dd4c38051a682cacfe71b7878d45120821e1b815a"} Sep 30 03:14:51 crc kubenswrapper[4744]: I0930 03:14:51.536162 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:51 crc kubenswrapper[4744]: I0930 03:14:51.570731 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.570707065 podStartE2EDuration="2.570707065s" podCreationTimestamp="2025-09-30 03:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:51.558864367 +0000 UTC m=+1218.732084351" watchObservedRunningTime="2025-09-30 03:14:51.570707065 +0000 UTC m=+1218.743927079" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.101445 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 
03:14:55.543360 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6jc7w"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.545856 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.550293 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.550555 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.562225 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6jc7w"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.661742 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-config-data\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.661954 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-scripts\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.662026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fbvc\" (UniqueName: \"kubernetes.io/projected/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-kube-api-access-5fbvc\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " 
pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.662101 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.705501 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.707035 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.708516 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.721608 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.763510 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-config-data\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.763608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-scripts\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.763642 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5fbvc\" (UniqueName: \"kubernetes.io/projected/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-kube-api-access-5fbvc\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.763677 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.775155 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-scripts\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.775282 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.779190 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-config-data\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.783041 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fbvc\" (UniqueName: 
\"kubernetes.io/projected/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-kube-api-access-5fbvc\") pod \"nova-cell0-cell-mapping-6jc7w\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.798632 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.809766 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.814690 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.826831 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.837180 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.838592 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.841971 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.860873 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.868535 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-config-data\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.868629 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.868675 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf25q\" (UniqueName: \"kubernetes.io/projected/264247e1-1195-4532-aafa-0eb43281e5b1-kube-api-access-pf25q\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.868881 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.903893 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.905484 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.915814 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.941955 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.980194 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/346dd188-63fa-4351-97ec-8e00be6cb731-logs\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.980450 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf25q\" (UniqueName: \"kubernetes.io/projected/264247e1-1195-4532-aafa-0eb43281e5b1-kube-api-access-pf25q\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.980590 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.980689 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c944c\" (UniqueName: \"kubernetes.io/projected/b7839fff-aa62-444b-9b89-07beed6dd699-kube-api-access-c944c\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.980811 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-config-data\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.980921 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4b4\" (UniqueName: \"kubernetes.io/projected/346dd188-63fa-4351-97ec-8e00be6cb731-kube-api-access-bt4b4\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.981006 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.981081 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7839fff-aa62-444b-9b89-07beed6dd699-logs\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.981153 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-config-data\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.981260 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.981351 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-config-data\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:55 crc kubenswrapper[4744]: I0930 03:14:55.985902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-config-data\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:55.997989 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:55.998046 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-b86jr"] Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:55.999560 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.019439 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-b86jr"] Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.032989 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf25q\" (UniqueName: \"kubernetes.io/projected/264247e1-1195-4532-aafa-0eb43281e5b1-kube-api-access-pf25q\") pod \"nova-scheduler-0\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " pod="openstack/nova-scheduler-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.044034 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.086863 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087112 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctv8\" (UniqueName: \"kubernetes.io/projected/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-kube-api-access-dctv8\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087148 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " 
pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087168 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087195 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-config-data\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087222 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/346dd188-63fa-4351-97ec-8e00be6cb731-logs\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087241 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p46d8\" (UniqueName: \"kubernetes.io/projected/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-kube-api-access-p46d8\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087289 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087305 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c944c\" (UniqueName: \"kubernetes.io/projected/b7839fff-aa62-444b-9b89-07beed6dd699-kube-api-access-c944c\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087326 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-svc\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087352 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087419 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4b4\" (UniqueName: \"kubernetes.io/projected/346dd188-63fa-4351-97ec-8e00be6cb731-kube-api-access-bt4b4\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087442 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-config\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087462 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087478 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087503 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7839fff-aa62-444b-9b89-07beed6dd699-logs\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087521 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-config-data\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.087902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/346dd188-63fa-4351-97ec-8e00be6cb731-logs\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.088222 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7839fff-aa62-444b-9b89-07beed6dd699-logs\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc 
kubenswrapper[4744]: I0930 03:14:56.091352 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.092532 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-config-data\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.103730 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.108895 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4b4\" (UniqueName: \"kubernetes.io/projected/346dd188-63fa-4351-97ec-8e00be6cb731-kube-api-access-bt4b4\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.109001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c944c\" (UniqueName: \"kubernetes.io/projected/b7839fff-aa62-444b-9b89-07beed6dd699-kube-api-access-c944c\") pod \"nova-metadata-0\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.116135 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-config-data\") pod \"nova-api-0\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188732 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-svc\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188780 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188836 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-config\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188853 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188909 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188928 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dctv8\" (UniqueName: \"kubernetes.io/projected/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-kube-api-access-dctv8\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188954 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.188974 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.189003 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p46d8\" (UniqueName: \"kubernetes.io/projected/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-kube-api-access-p46d8\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.190285 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-svc\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " 
pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.191129 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.191683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-config\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.192150 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.193234 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.193606 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 
03:14:56.194799 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.208144 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctv8\" (UniqueName: \"kubernetes.io/projected/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-kube-api-access-dctv8\") pod \"dnsmasq-dns-7d5fbbb8c5-b86jr\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.208346 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p46d8\" (UniqueName: \"kubernetes.io/projected/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-kube-api-access-p46d8\") pod \"nova-cell1-novncproxy-0\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.273974 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.334691 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.351910 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.362846 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.542789 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6jc7w"] Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.631637 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.663986 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpjh8"] Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.667109 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.671705 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.671883 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.674437 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpjh8"] Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.809596 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-scripts\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.809905 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: 
\"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.809961 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbz25\" (UniqueName: \"kubernetes.io/projected/98f23037-0380-4f1a-adc4-8ef59910c1f6-kube-api-access-gbz25\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.810029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-config-data\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.911404 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-scripts\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.911446 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.911502 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbz25\" (UniqueName: 
\"kubernetes.io/projected/98f23037-0380-4f1a-adc4-8ef59910c1f6-kube-api-access-gbz25\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.911730 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-config-data\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.918091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.922235 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-scripts\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.922891 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-config-data\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.935621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbz25\" (UniqueName: 
\"kubernetes.io/projected/98f23037-0380-4f1a-adc4-8ef59910c1f6-kube-api-access-gbz25\") pod \"nova-cell1-conductor-db-sync-zpjh8\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:56 crc kubenswrapper[4744]: I0930 03:14:56.938984 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.010772 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.036772 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.234934 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-b86jr"] Sep 30 03:14:57 crc kubenswrapper[4744]: W0930 03:14:57.242986 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9a0bae2_8e99_4164_9b56_e7bdaa5cde65.slice/crio-3ba3040650919edc2ba71578f448fe8a5d160a216149898e3b646c9e988ee366 WatchSource:0}: Error finding container 3ba3040650919edc2ba71578f448fe8a5d160a216149898e3b646c9e988ee366: Status 404 returned error can't find the container with id 3ba3040650919edc2ba71578f448fe8a5d160a216149898e3b646c9e988ee366 Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.258097 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:14:57 crc kubenswrapper[4744]: W0930 03:14:57.264266 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb10cc4e5_8c8d_4012_b0ef_3c7beaea3374.slice/crio-6bf5cf6ba1c41efdfb6e3555c5b77113bb68f9a68579e56b53675add9d6a3a02 WatchSource:0}: Error finding container 6bf5cf6ba1c41efdfb6e3555c5b77113bb68f9a68579e56b53675add9d6a3a02: 
Status 404 returned error can't find the container with id 6bf5cf6ba1c41efdfb6e3555c5b77113bb68f9a68579e56b53675add9d6a3a02 Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.479938 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpjh8"] Sep 30 03:14:57 crc kubenswrapper[4744]: W0930 03:14:57.530058 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98f23037_0380_4f1a_adc4_8ef59910c1f6.slice/crio-54389abda10066c83501912c56a2077a0f17d7d387ca6e5e81e061d4f936a791 WatchSource:0}: Error finding container 54389abda10066c83501912c56a2077a0f17d7d387ca6e5e81e061d4f936a791: Status 404 returned error can't find the container with id 54389abda10066c83501912c56a2077a0f17d7d387ca6e5e81e061d4f936a791 Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.617775 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" event={"ID":"98f23037-0380-4f1a-adc4-8ef59910c1f6","Type":"ContainerStarted","Data":"54389abda10066c83501912c56a2077a0f17d7d387ca6e5e81e061d4f936a791"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.619307 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6jc7w" event={"ID":"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e","Type":"ContainerStarted","Data":"356fb22a7828dc96b7a56c602cb2a3a32fc45c74aad761c67b031451a0dde60f"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.619350 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6jc7w" event={"ID":"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e","Type":"ContainerStarted","Data":"0cd0d73ea397c285f93afdd7032c8cec902b2f9ef414ffb845cb0fd7db1af8d9"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.621620 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374","Type":"ContainerStarted","Data":"6bf5cf6ba1c41efdfb6e3555c5b77113bb68f9a68579e56b53675add9d6a3a02"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.623619 4744 generic.go:334] "Generic (PLEG): container finished" podID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerID="ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956" exitCode=0 Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.623754 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" event={"ID":"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65","Type":"ContainerDied","Data":"ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.623791 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" event={"ID":"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65","Type":"ContainerStarted","Data":"3ba3040650919edc2ba71578f448fe8a5d160a216149898e3b646c9e988ee366"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.625763 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"264247e1-1195-4532-aafa-0eb43281e5b1","Type":"ContainerStarted","Data":"32780447bc2d1b203d2a553040cf655970afaf84fae0e9bc72a494f563023f5b"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.628804 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"346dd188-63fa-4351-97ec-8e00be6cb731","Type":"ContainerStarted","Data":"07268e825bd6dd21bd9e7930b1f132caf8a3f4644cf56be8ced3a846099156be"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.630813 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7839fff-aa62-444b-9b89-07beed6dd699","Type":"ContainerStarted","Data":"664b320cf17e5f5211413e7dfd36e11044c42cf3f441feefc9fe61b3662b4c21"} Sep 30 03:14:57 crc kubenswrapper[4744]: I0930 03:14:57.646903 
4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6jc7w" podStartSLOduration=2.646883674 podStartE2EDuration="2.646883674s" podCreationTimestamp="2025-09-30 03:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:57.634407206 +0000 UTC m=+1224.807627170" watchObservedRunningTime="2025-09-30 03:14:57.646883674 +0000 UTC m=+1224.820103648" Sep 30 03:14:58 crc kubenswrapper[4744]: I0930 03:14:58.644041 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" event={"ID":"98f23037-0380-4f1a-adc4-8ef59910c1f6","Type":"ContainerStarted","Data":"70b57b054706f2fc74bd01ff06ef4f9fccfd73207ffdca04da3a240618336cb0"} Sep 30 03:14:58 crc kubenswrapper[4744]: I0930 03:14:58.663820 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" podStartSLOduration=2.663800504 podStartE2EDuration="2.663800504s" podCreationTimestamp="2025-09-30 03:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:14:58.659730668 +0000 UTC m=+1225.832950662" watchObservedRunningTime="2025-09-30 03:14:58.663800504 +0000 UTC m=+1225.837020478" Sep 30 03:14:59 crc kubenswrapper[4744]: I0930 03:14:59.552155 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:14:59 crc kubenswrapper[4744]: I0930 03:14:59.562994 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.133557 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2"] Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.135273 4744 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.137981 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.138804 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.143080 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2"] Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.191697 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c155cc9-f717-47c3-924e-5fbb08c82456-config-volume\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.191752 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c155cc9-f717-47c3-924e-5fbb08c82456-secret-volume\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.191934 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjqd\" (UniqueName: \"kubernetes.io/projected/7c155cc9-f717-47c3-924e-5fbb08c82456-kube-api-access-ntjqd\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.293726 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjqd\" (UniqueName: \"kubernetes.io/projected/7c155cc9-f717-47c3-924e-5fbb08c82456-kube-api-access-ntjqd\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.293839 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c155cc9-f717-47c3-924e-5fbb08c82456-config-volume\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.293862 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c155cc9-f717-47c3-924e-5fbb08c82456-secret-volume\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.295318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c155cc9-f717-47c3-924e-5fbb08c82456-config-volume\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.300406 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7c155cc9-f717-47c3-924e-5fbb08c82456-secret-volume\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.310394 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjqd\" (UniqueName: \"kubernetes.io/projected/7c155cc9-f717-47c3-924e-5fbb08c82456-kube-api-access-ntjqd\") pod \"collect-profiles-29320035-2h8r2\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.466025 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.686827 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"264247e1-1195-4532-aafa-0eb43281e5b1","Type":"ContainerStarted","Data":"678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.691221 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"346dd188-63fa-4351-97ec-8e00be6cb731","Type":"ContainerStarted","Data":"704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.691246 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"346dd188-63fa-4351-97ec-8e00be6cb731","Type":"ContainerStarted","Data":"07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.693051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b7839fff-aa62-444b-9b89-07beed6dd699","Type":"ContainerStarted","Data":"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.693094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7839fff-aa62-444b-9b89-07beed6dd699","Type":"ContainerStarted","Data":"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.693210 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-log" containerID="cri-o://e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b" gracePeriod=30 Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.693465 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-metadata" containerID="cri-o://f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1" gracePeriod=30 Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.706481 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374","Type":"ContainerStarted","Data":"2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.706809 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba" gracePeriod=30 Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.713492 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.9863580560000003 podStartE2EDuration="5.713467377s" podCreationTimestamp="2025-09-30 03:14:55 +0000 UTC" firstStartedPulling="2025-09-30 03:14:56.65695021 +0000 UTC m=+1223.830170184" lastFinishedPulling="2025-09-30 03:14:59.384059531 +0000 UTC m=+1226.557279505" observedRunningTime="2025-09-30 03:15:00.703142327 +0000 UTC m=+1227.876362321" watchObservedRunningTime="2025-09-30 03:15:00.713467377 +0000 UTC m=+1227.886687351" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.715310 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" event={"ID":"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65","Type":"ContainerStarted","Data":"b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d"} Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.716707 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.724856 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.391128905 podStartE2EDuration="5.72483923s" podCreationTimestamp="2025-09-30 03:14:55 +0000 UTC" firstStartedPulling="2025-09-30 03:14:57.050356786 +0000 UTC m=+1224.223576760" lastFinishedPulling="2025-09-30 03:14:59.384067071 +0000 UTC m=+1226.557287085" observedRunningTime="2025-09-30 03:15:00.723849479 +0000 UTC m=+1227.897069453" watchObservedRunningTime="2025-09-30 03:15:00.72483923 +0000 UTC m=+1227.898059204" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.747197 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.329920826 podStartE2EDuration="5.747179443s" podCreationTimestamp="2025-09-30 03:14:55 +0000 UTC" firstStartedPulling="2025-09-30 03:14:56.969167347 +0000 UTC m=+1224.142387321" lastFinishedPulling="2025-09-30 
03:14:59.386425954 +0000 UTC m=+1226.559645938" observedRunningTime="2025-09-30 03:15:00.742077224 +0000 UTC m=+1227.915297218" watchObservedRunningTime="2025-09-30 03:15:00.747179443 +0000 UTC m=+1227.920399417" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.772063 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.665404488 podStartE2EDuration="5.772045595s" podCreationTimestamp="2025-09-30 03:14:55 +0000 UTC" firstStartedPulling="2025-09-30 03:14:57.266218223 +0000 UTC m=+1224.439438197" lastFinishedPulling="2025-09-30 03:15:00.37285933 +0000 UTC m=+1227.546079304" observedRunningTime="2025-09-30 03:15:00.761569739 +0000 UTC m=+1227.934789713" watchObservedRunningTime="2025-09-30 03:15:00.772045595 +0000 UTC m=+1227.945265569" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.785640 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" podStartSLOduration=5.785598415 podStartE2EDuration="5.785598415s" podCreationTimestamp="2025-09-30 03:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:00.783776958 +0000 UTC m=+1227.956996932" watchObservedRunningTime="2025-09-30 03:15:00.785598415 +0000 UTC m=+1227.958818389" Sep 30 03:15:00 crc kubenswrapper[4744]: I0930 03:15:00.944224 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2"] Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.044275 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.335854 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.335925 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.352595 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.644146 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725450 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7839fff-aa62-444b-9b89-07beed6dd699" containerID="f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1" exitCode=0 Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725481 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7839fff-aa62-444b-9b89-07beed6dd699" containerID="e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b" exitCode=143 Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725524 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725540 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7839fff-aa62-444b-9b89-07beed6dd699","Type":"ContainerDied","Data":"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1"} Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725586 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7839fff-aa62-444b-9b89-07beed6dd699","Type":"ContainerDied","Data":"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b"} Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725627 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b7839fff-aa62-444b-9b89-07beed6dd699","Type":"ContainerDied","Data":"664b320cf17e5f5211413e7dfd36e11044c42cf3f441feefc9fe61b3662b4c21"} Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.725612 4744 scope.go:117] "RemoveContainer" containerID="f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.726646 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c944c\" (UniqueName: \"kubernetes.io/projected/b7839fff-aa62-444b-9b89-07beed6dd699-kube-api-access-c944c\") pod \"b7839fff-aa62-444b-9b89-07beed6dd699\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.726985 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-config-data\") pod \"b7839fff-aa62-444b-9b89-07beed6dd699\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.727107 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7839fff-aa62-444b-9b89-07beed6dd699-logs\") pod \"b7839fff-aa62-444b-9b89-07beed6dd699\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.727945 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-combined-ca-bundle\") pod \"b7839fff-aa62-444b-9b89-07beed6dd699\" (UID: \"b7839fff-aa62-444b-9b89-07beed6dd699\") " Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.728217 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7839fff-aa62-444b-9b89-07beed6dd699-logs" (OuterVolumeSpecName: "logs") pod "b7839fff-aa62-444b-9b89-07beed6dd699" (UID: "b7839fff-aa62-444b-9b89-07beed6dd699"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.728677 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7839fff-aa62-444b-9b89-07beed6dd699-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.732998 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7839fff-aa62-444b-9b89-07beed6dd699-kube-api-access-c944c" (OuterVolumeSpecName: "kube-api-access-c944c") pod "b7839fff-aa62-444b-9b89-07beed6dd699" (UID: "b7839fff-aa62-444b-9b89-07beed6dd699"). InnerVolumeSpecName "kube-api-access-c944c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.743784 4744 generic.go:334] "Generic (PLEG): container finished" podID="7c155cc9-f717-47c3-924e-5fbb08c82456" containerID="8f2cdf7dd7d721d6924555a7ea45c66f1015a9330a70a71342fbd1cbb353453a" exitCode=0 Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.745360 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" event={"ID":"7c155cc9-f717-47c3-924e-5fbb08c82456","Type":"ContainerDied","Data":"8f2cdf7dd7d721d6924555a7ea45c66f1015a9330a70a71342fbd1cbb353453a"} Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.745520 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" event={"ID":"7c155cc9-f717-47c3-924e-5fbb08c82456","Type":"ContainerStarted","Data":"6aa68bef9d797f7ed2033375ea9cf053c20cae18b0e5d15597c164382c1b78ba"} Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.771519 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7839fff-aa62-444b-9b89-07beed6dd699" (UID: "b7839fff-aa62-444b-9b89-07beed6dd699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.793528 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-config-data" (OuterVolumeSpecName: "config-data") pod "b7839fff-aa62-444b-9b89-07beed6dd699" (UID: "b7839fff-aa62-444b-9b89-07beed6dd699"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.830364 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.830410 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7839fff-aa62-444b-9b89-07beed6dd699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.830421 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c944c\" (UniqueName: \"kubernetes.io/projected/b7839fff-aa62-444b-9b89-07beed6dd699-kube-api-access-c944c\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.845189 4744 scope.go:117] "RemoveContainer" containerID="e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.875407 4744 scope.go:117] "RemoveContainer" containerID="f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1" Sep 30 03:15:01 crc kubenswrapper[4744]: E0930 03:15:01.876394 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1\": container with ID starting with f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1 not found: ID does not exist" containerID="f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.876451 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1"} err="failed to get container status 
\"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1\": rpc error: code = NotFound desc = could not find container \"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1\": container with ID starting with f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1 not found: ID does not exist" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.876475 4744 scope.go:117] "RemoveContainer" containerID="e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b" Sep 30 03:15:01 crc kubenswrapper[4744]: E0930 03:15:01.876805 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b\": container with ID starting with e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b not found: ID does not exist" containerID="e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.876828 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b"} err="failed to get container status \"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b\": rpc error: code = NotFound desc = could not find container \"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b\": container with ID starting with e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b not found: ID does not exist" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.876843 4744 scope.go:117] "RemoveContainer" containerID="f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.876999 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1"} err="failed to get 
container status \"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1\": rpc error: code = NotFound desc = could not find container \"f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1\": container with ID starting with f357b574886ebadbfdf20c17ddb32a54eab07064dcf833ba66289f33083f93a1 not found: ID does not exist" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.877011 4744 scope.go:117] "RemoveContainer" containerID="e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b" Sep 30 03:15:01 crc kubenswrapper[4744]: I0930 03:15:01.877178 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b"} err="failed to get container status \"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b\": rpc error: code = NotFound desc = could not find container \"e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b\": container with ID starting with e39ff7d4929b07a885be68f8c024772fc710b7afa1b603a3004fbe5cb736e78b not found: ID does not exist" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.070136 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.085658 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.120861 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:02 crc kubenswrapper[4744]: E0930 03:15:02.122172 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-metadata" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.122207 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-metadata" Sep 30 03:15:02 crc 
kubenswrapper[4744]: E0930 03:15:02.122235 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-log" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.122249 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-log" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.124139 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-log" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.124180 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" containerName="nova-metadata-metadata" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.135299 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.139543 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.140063 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.191029 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.249158 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.249230 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-config-data\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.249315 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.249478 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4j9r\" (UniqueName: \"kubernetes.io/projected/04775e5c-eb80-4810-b804-03282f73d19e-kube-api-access-p4j9r\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.249676 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04775e5c-eb80-4810-b804-03282f73d19e-logs\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.351658 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4j9r\" (UniqueName: \"kubernetes.io/projected/04775e5c-eb80-4810-b804-03282f73d19e-kube-api-access-p4j9r\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.351832 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/04775e5c-eb80-4810-b804-03282f73d19e-logs\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.351981 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.352038 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-config-data\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.352111 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.352528 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04775e5c-eb80-4810-b804-03282f73d19e-logs\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.357029 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-config-data\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: 
I0930 03:15:02.357169 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.357677 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.376056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4j9r\" (UniqueName: \"kubernetes.io/projected/04775e5c-eb80-4810-b804-03282f73d19e-kube-api-access-p4j9r\") pod \"nova-metadata-0\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.468106 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:02 crc kubenswrapper[4744]: I0930 03:15:02.932407 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:02 crc kubenswrapper[4744]: W0930 03:15:02.969125 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04775e5c_eb80_4810_b804_03282f73d19e.slice/crio-59be505abb90efcd1074241b5d7b855f327e298335706ec62362a194f053c0aa WatchSource:0}: Error finding container 59be505abb90efcd1074241b5d7b855f327e298335706ec62362a194f053c0aa: Status 404 returned error can't find the container with id 59be505abb90efcd1074241b5d7b855f327e298335706ec62362a194f053c0aa Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.110675 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.169207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c155cc9-f717-47c3-924e-5fbb08c82456-config-volume\") pod \"7c155cc9-f717-47c3-924e-5fbb08c82456\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.169289 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c155cc9-f717-47c3-924e-5fbb08c82456-secret-volume\") pod \"7c155cc9-f717-47c3-924e-5fbb08c82456\" (UID: \"7c155cc9-f717-47c3-924e-5fbb08c82456\") " Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.169363 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntjqd\" (UniqueName: \"kubernetes.io/projected/7c155cc9-f717-47c3-924e-5fbb08c82456-kube-api-access-ntjqd\") pod \"7c155cc9-f717-47c3-924e-5fbb08c82456\" (UID: 
\"7c155cc9-f717-47c3-924e-5fbb08c82456\") " Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.170236 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c155cc9-f717-47c3-924e-5fbb08c82456-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c155cc9-f717-47c3-924e-5fbb08c82456" (UID: "7c155cc9-f717-47c3-924e-5fbb08c82456"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.170631 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c155cc9-f717-47c3-924e-5fbb08c82456-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.176272 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c155cc9-f717-47c3-924e-5fbb08c82456-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c155cc9-f717-47c3-924e-5fbb08c82456" (UID: "7c155cc9-f717-47c3-924e-5fbb08c82456"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.180909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c155cc9-f717-47c3-924e-5fbb08c82456-kube-api-access-ntjqd" (OuterVolumeSpecName: "kube-api-access-ntjqd") pod "7c155cc9-f717-47c3-924e-5fbb08c82456" (UID: "7c155cc9-f717-47c3-924e-5fbb08c82456"). InnerVolumeSpecName "kube-api-access-ntjqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.273029 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c155cc9-f717-47c3-924e-5fbb08c82456-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.273082 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntjqd\" (UniqueName: \"kubernetes.io/projected/7c155cc9-f717-47c3-924e-5fbb08c82456-kube-api-access-ntjqd\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.530038 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7839fff-aa62-444b-9b89-07beed6dd699" path="/var/lib/kubelet/pods/b7839fff-aa62-444b-9b89-07beed6dd699/volumes" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.770132 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04775e5c-eb80-4810-b804-03282f73d19e","Type":"ContainerStarted","Data":"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b"} Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.770423 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04775e5c-eb80-4810-b804-03282f73d19e","Type":"ContainerStarted","Data":"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26"} Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.770486 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04775e5c-eb80-4810-b804-03282f73d19e","Type":"ContainerStarted","Data":"59be505abb90efcd1074241b5d7b855f327e298335706ec62362a194f053c0aa"} Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.771881 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" 
event={"ID":"7c155cc9-f717-47c3-924e-5fbb08c82456","Type":"ContainerDied","Data":"6aa68bef9d797f7ed2033375ea9cf053c20cae18b0e5d15597c164382c1b78ba"} Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.771912 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa68bef9d797f7ed2033375ea9cf053c20cae18b0e5d15597c164382c1b78ba" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.771961 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2" Sep 30 03:15:03 crc kubenswrapper[4744]: I0930 03:15:03.791162 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.791143254 podStartE2EDuration="1.791143254s" podCreationTimestamp="2025-09-30 03:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:03.786341356 +0000 UTC m=+1230.959561330" watchObservedRunningTime="2025-09-30 03:15:03.791143254 +0000 UTC m=+1230.964363228" Sep 30 03:15:04 crc kubenswrapper[4744]: I0930 03:15:04.802662 4744 generic.go:334] "Generic (PLEG): container finished" podID="98f23037-0380-4f1a-adc4-8ef59910c1f6" containerID="70b57b054706f2fc74bd01ff06ef4f9fccfd73207ffdca04da3a240618336cb0" exitCode=0 Sep 30 03:15:04 crc kubenswrapper[4744]: I0930 03:15:04.802758 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" event={"ID":"98f23037-0380-4f1a-adc4-8ef59910c1f6","Type":"ContainerDied","Data":"70b57b054706f2fc74bd01ff06ef4f9fccfd73207ffdca04da3a240618336cb0"} Sep 30 03:15:04 crc kubenswrapper[4744]: I0930 03:15:04.806406 4744 generic.go:334] "Generic (PLEG): container finished" podID="cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" containerID="356fb22a7828dc96b7a56c602cb2a3a32fc45c74aad761c67b031451a0dde60f" exitCode=0 Sep 30 03:15:04 crc 
kubenswrapper[4744]: I0930 03:15:04.806856 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6jc7w" event={"ID":"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e","Type":"ContainerDied","Data":"356fb22a7828dc96b7a56c602cb2a3a32fc45c74aad761c67b031451a0dde60f"} Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.044908 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.104466 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.275509 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.275595 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.331224 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.336474 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.375590 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.443436 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-s4m4j"] Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.443639 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" containerName="dnsmasq-dns" containerID="cri-o://b51489a87cb59a83deb39652c5d9471a460958b788d0c6b660b75d1ed64a725c" gracePeriod=10 Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543063 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-combined-ca-bundle\") pod \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fbvc\" (UniqueName: \"kubernetes.io/projected/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-kube-api-access-5fbvc\") pod \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543200 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-scripts\") pod \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543250 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-config-data\") pod \"98f23037-0380-4f1a-adc4-8ef59910c1f6\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543280 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbz25\" (UniqueName: \"kubernetes.io/projected/98f23037-0380-4f1a-adc4-8ef59910c1f6-kube-api-access-gbz25\") pod \"98f23037-0380-4f1a-adc4-8ef59910c1f6\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543312 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-combined-ca-bundle\") pod \"98f23037-0380-4f1a-adc4-8ef59910c1f6\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543404 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-config-data\") pod \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\" (UID: \"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.543429 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-scripts\") pod \"98f23037-0380-4f1a-adc4-8ef59910c1f6\" (UID: \"98f23037-0380-4f1a-adc4-8ef59910c1f6\") " Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.573957 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-scripts" (OuterVolumeSpecName: "scripts") pod "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" (UID: "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.574163 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-scripts" (OuterVolumeSpecName: "scripts") pod "98f23037-0380-4f1a-adc4-8ef59910c1f6" (UID: "98f23037-0380-4f1a-adc4-8ef59910c1f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.585986 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-kube-api-access-5fbvc" (OuterVolumeSpecName: "kube-api-access-5fbvc") pod "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" (UID: "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e"). InnerVolumeSpecName "kube-api-access-5fbvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.608614 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f23037-0380-4f1a-adc4-8ef59910c1f6-kube-api-access-gbz25" (OuterVolumeSpecName: "kube-api-access-gbz25") pod "98f23037-0380-4f1a-adc4-8ef59910c1f6" (UID: "98f23037-0380-4f1a-adc4-8ef59910c1f6"). InnerVolumeSpecName "kube-api-access-gbz25". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.629441 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" (UID: "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.649466 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.649497 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.649507 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fbvc\" (UniqueName: \"kubernetes.io/projected/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-kube-api-access-5fbvc\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.649515 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.649524 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbz25\" (UniqueName: \"kubernetes.io/projected/98f23037-0380-4f1a-adc4-8ef59910c1f6-kube-api-access-gbz25\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.671259 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-config-data" (OuterVolumeSpecName: "config-data") pod "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" (UID: "cbfbe619-9f32-4938-ab7c-e32cdbbdf94e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.672050 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98f23037-0380-4f1a-adc4-8ef59910c1f6" (UID: "98f23037-0380-4f1a-adc4-8ef59910c1f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.672608 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-config-data" (OuterVolumeSpecName: "config-data") pod "98f23037-0380-4f1a-adc4-8ef59910c1f6" (UID: "98f23037-0380-4f1a-adc4-8ef59910c1f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.752632 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.752663 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.752673 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f23037-0380-4f1a-adc4-8ef59910c1f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.831592 4744 generic.go:334] "Generic (PLEG): container finished" podID="409d892b-c223-43a8-9550-ff50fb759e2b" containerID="b51489a87cb59a83deb39652c5d9471a460958b788d0c6b660b75d1ed64a725c" exitCode=0 Sep 30 03:15:06 crc 
kubenswrapper[4744]: I0930 03:15:06.831665 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" event={"ID":"409d892b-c223-43a8-9550-ff50fb759e2b","Type":"ContainerDied","Data":"b51489a87cb59a83deb39652c5d9471a460958b788d0c6b660b75d1ed64a725c"} Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.833098 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" event={"ID":"98f23037-0380-4f1a-adc4-8ef59910c1f6","Type":"ContainerDied","Data":"54389abda10066c83501912c56a2077a0f17d7d387ca6e5e81e061d4f936a791"} Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.833121 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54389abda10066c83501912c56a2077a0f17d7d387ca6e5e81e061d4f936a791" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.833176 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpjh8" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.835797 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6jc7w" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.835839 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6jc7w" event={"ID":"cbfbe619-9f32-4938-ab7c-e32cdbbdf94e","Type":"ContainerDied","Data":"0cd0d73ea397c285f93afdd7032c8cec902b2f9ef414ffb845cb0fd7db1af8d9"} Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.835867 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cd0d73ea397c285f93afdd7032c8cec902b2f9ef414ffb845cb0fd7db1af8d9" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.911657 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.922956 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 03:15:06 crc kubenswrapper[4744]: E0930 03:15:06.923473 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" containerName="nova-manage" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.923493 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" containerName="nova-manage" Sep 30 03:15:06 crc kubenswrapper[4744]: E0930 03:15:06.923513 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c155cc9-f717-47c3-924e-5fbb08c82456" containerName="collect-profiles" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.923521 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c155cc9-f717-47c3-924e-5fbb08c82456" containerName="collect-profiles" Sep 30 03:15:06 crc kubenswrapper[4744]: E0930 03:15:06.923540 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f23037-0380-4f1a-adc4-8ef59910c1f6" containerName="nova-cell1-conductor-db-sync" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.923548 
4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f23037-0380-4f1a-adc4-8ef59910c1f6" containerName="nova-cell1-conductor-db-sync" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.923751 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c155cc9-f717-47c3-924e-5fbb08c82456" containerName="collect-profiles" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.923772 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" containerName="nova-manage" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.923783 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f23037-0380-4f1a-adc4-8ef59910c1f6" containerName="nova-cell1-conductor-db-sync" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.924523 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.927765 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.934492 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 03:15:06 crc kubenswrapper[4744]: I0930 03:15:06.999322 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.019426 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.019628 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-log" containerID="cri-o://07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025" gracePeriod=30 Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.019699 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-api" containerID="cri-o://704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738" gracePeriod=30 Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.028434 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": EOF" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.028438 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": EOF" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.050109 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.050306 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-log" containerID="cri-o://8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26" gracePeriod=30 Sep 30 03:15:07 
crc kubenswrapper[4744]: I0930 03:15:07.050709 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-metadata" containerID="cri-o://a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b" gracePeriod=30 Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.064087 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c58372-9d54-41ad-8059-5666ff3ab3c6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.064278 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn29j\" (UniqueName: \"kubernetes.io/projected/45c58372-9d54-41ad-8059-5666ff3ab3c6-kube-api-access-kn29j\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.064349 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c58372-9d54-41ad-8059-5666ff3ab3c6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.165639 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-swift-storage-0\") pod \"409d892b-c223-43a8-9550-ff50fb759e2b\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.165742 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-nb\") pod \"409d892b-c223-43a8-9550-ff50fb759e2b\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.165787 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-svc\") pod \"409d892b-c223-43a8-9550-ff50fb759e2b\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.165906 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-sb\") pod \"409d892b-c223-43a8-9550-ff50fb759e2b\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.165972 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs2ns\" (UniqueName: \"kubernetes.io/projected/409d892b-c223-43a8-9550-ff50fb759e2b-kube-api-access-fs2ns\") pod \"409d892b-c223-43a8-9550-ff50fb759e2b\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.166034 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-config\") pod \"409d892b-c223-43a8-9550-ff50fb759e2b\" (UID: \"409d892b-c223-43a8-9550-ff50fb759e2b\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.166468 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c58372-9d54-41ad-8059-5666ff3ab3c6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " 
pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.166532 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn29j\" (UniqueName: \"kubernetes.io/projected/45c58372-9d54-41ad-8059-5666ff3ab3c6-kube-api-access-kn29j\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.166570 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c58372-9d54-41ad-8059-5666ff3ab3c6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.171335 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c58372-9d54-41ad-8059-5666ff3ab3c6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.178068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c58372-9d54-41ad-8059-5666ff3ab3c6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.180540 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409d892b-c223-43a8-9550-ff50fb759e2b-kube-api-access-fs2ns" (OuterVolumeSpecName: "kube-api-access-fs2ns") pod "409d892b-c223-43a8-9550-ff50fb759e2b" (UID: "409d892b-c223-43a8-9550-ff50fb759e2b"). InnerVolumeSpecName "kube-api-access-fs2ns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.190453 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn29j\" (UniqueName: \"kubernetes.io/projected/45c58372-9d54-41ad-8059-5666ff3ab3c6-kube-api-access-kn29j\") pod \"nova-cell1-conductor-0\" (UID: \"45c58372-9d54-41ad-8059-5666ff3ab3c6\") " pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.229651 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "409d892b-c223-43a8-9550-ff50fb759e2b" (UID: "409d892b-c223-43a8-9550-ff50fb759e2b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.231955 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "409d892b-c223-43a8-9550-ff50fb759e2b" (UID: "409d892b-c223-43a8-9550-ff50fb759e2b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.238708 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "409d892b-c223-43a8-9550-ff50fb759e2b" (UID: "409d892b-c223-43a8-9550-ff50fb759e2b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.245760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "409d892b-c223-43a8-9550-ff50fb759e2b" (UID: "409d892b-c223-43a8-9550-ff50fb759e2b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.247448 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-config" (OuterVolumeSpecName: "config") pod "409d892b-c223-43a8-9550-ff50fb759e2b" (UID: "409d892b-c223-43a8-9550-ff50fb759e2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.249158 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.270454 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.270658 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs2ns\" (UniqueName: \"kubernetes.io/projected/409d892b-c223-43a8-9550-ff50fb759e2b-kube-api-access-fs2ns\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.270745 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.270813 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.270876 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.270939 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/409d892b-c223-43a8-9550-ff50fb759e2b-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.436423 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.470216 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:15:07 
crc kubenswrapper[4744]: I0930 03:15:07.470283 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.645400 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.783149 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.786938 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04775e5c-eb80-4810-b804-03282f73d19e-logs\") pod \"04775e5c-eb80-4810-b804-03282f73d19e\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.787030 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-combined-ca-bundle\") pod \"04775e5c-eb80-4810-b804-03282f73d19e\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.787147 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-config-data\") pod \"04775e5c-eb80-4810-b804-03282f73d19e\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.787207 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-nova-metadata-tls-certs\") pod \"04775e5c-eb80-4810-b804-03282f73d19e\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.787282 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p4j9r\" (UniqueName: \"kubernetes.io/projected/04775e5c-eb80-4810-b804-03282f73d19e-kube-api-access-p4j9r\") pod \"04775e5c-eb80-4810-b804-03282f73d19e\" (UID: \"04775e5c-eb80-4810-b804-03282f73d19e\") " Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.788932 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04775e5c-eb80-4810-b804-03282f73d19e-logs" (OuterVolumeSpecName: "logs") pod "04775e5c-eb80-4810-b804-03282f73d19e" (UID: "04775e5c-eb80-4810-b804-03282f73d19e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.792927 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04775e5c-eb80-4810-b804-03282f73d19e-kube-api-access-p4j9r" (OuterVolumeSpecName: "kube-api-access-p4j9r") pod "04775e5c-eb80-4810-b804-03282f73d19e" (UID: "04775e5c-eb80-4810-b804-03282f73d19e"). InnerVolumeSpecName "kube-api-access-p4j9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: W0930 03:15:07.793338 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c58372_9d54_41ad_8059_5666ff3ab3c6.slice/crio-61b43c9736ead4ba15d28de3dc29d713025e390ac81bb4e8af31b46d10e4854a WatchSource:0}: Error finding container 61b43c9736ead4ba15d28de3dc29d713025e390ac81bb4e8af31b46d10e4854a: Status 404 returned error can't find the container with id 61b43c9736ead4ba15d28de3dc29d713025e390ac81bb4e8af31b46d10e4854a Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.815068 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04775e5c-eb80-4810-b804-03282f73d19e" (UID: "04775e5c-eb80-4810-b804-03282f73d19e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.830977 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-config-data" (OuterVolumeSpecName: "config-data") pod "04775e5c-eb80-4810-b804-03282f73d19e" (UID: "04775e5c-eb80-4810-b804-03282f73d19e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.846200 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" event={"ID":"409d892b-c223-43a8-9550-ff50fb759e2b","Type":"ContainerDied","Data":"d04f89a472c0351f699d86f9936d5cfd55a1c2eb1af7636ce135b1d5db42ac2b"} Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.846506 4744 scope.go:117] "RemoveContainer" containerID="b51489a87cb59a83deb39652c5d9471a460958b788d0c6b660b75d1ed64a725c" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.846288 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-s4m4j" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.848141 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"45c58372-9d54-41ad-8059-5666ff3ab3c6","Type":"ContainerStarted","Data":"61b43c9736ead4ba15d28de3dc29d713025e390ac81bb4e8af31b46d10e4854a"} Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.850780 4744 generic.go:334] "Generic (PLEG): container finished" podID="346dd188-63fa-4351-97ec-8e00be6cb731" containerID="07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025" exitCode=143 Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.850833 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"346dd188-63fa-4351-97ec-8e00be6cb731","Type":"ContainerDied","Data":"07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025"} Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.851806 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "04775e5c-eb80-4810-b804-03282f73d19e" (UID: "04775e5c-eb80-4810-b804-03282f73d19e"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.852991 4744 generic.go:334] "Generic (PLEG): container finished" podID="04775e5c-eb80-4810-b804-03282f73d19e" containerID="a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b" exitCode=0 Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.853004 4744 generic.go:334] "Generic (PLEG): container finished" podID="04775e5c-eb80-4810-b804-03282f73d19e" containerID="8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26" exitCode=143 Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.853104 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.853248 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04775e5c-eb80-4810-b804-03282f73d19e","Type":"ContainerDied","Data":"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b"} Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.853296 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04775e5c-eb80-4810-b804-03282f73d19e","Type":"ContainerDied","Data":"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26"} Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.853324 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04775e5c-eb80-4810-b804-03282f73d19e","Type":"ContainerDied","Data":"59be505abb90efcd1074241b5d7b855f327e298335706ec62362a194f053c0aa"} Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.876478 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-s4m4j"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.884394 4744 scope.go:117] "RemoveContainer" 
containerID="0d1a2eb3851b60a0e03db74addc8f9de94facfb9132f4748e75576488014d6b4" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.885617 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-s4m4j"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.889854 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.889957 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.890046 4744 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04775e5c-eb80-4810-b804-03282f73d19e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.890127 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4j9r\" (UniqueName: \"kubernetes.io/projected/04775e5c-eb80-4810-b804-03282f73d19e-kube-api-access-p4j9r\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.890206 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04775e5c-eb80-4810-b804-03282f73d19e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.911828 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.932232 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.944700 4744 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: E0930 03:15:07.945133 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-log" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945150 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-log" Sep 30 03:15:07 crc kubenswrapper[4744]: E0930 03:15:07.945159 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-metadata" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945166 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-metadata" Sep 30 03:15:07 crc kubenswrapper[4744]: E0930 03:15:07.945181 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" containerName="init" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945187 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" containerName="init" Sep 30 03:15:07 crc kubenswrapper[4744]: E0930 03:15:07.945220 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" containerName="dnsmasq-dns" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945228 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" containerName="dnsmasq-dns" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945417 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" containerName="dnsmasq-dns" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945438 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="04775e5c-eb80-4810-b804-03282f73d19e" 
containerName="nova-metadata-log" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.945449 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="04775e5c-eb80-4810-b804-03282f73d19e" containerName="nova-metadata-metadata" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.946874 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.947037 4744 scope.go:117] "RemoveContainer" containerID="a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.948191 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.952164 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.952601 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 03:15:07 crc kubenswrapper[4744]: I0930 03:15:07.975065 4744 scope.go:117] "RemoveContainer" containerID="8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.011402 4744 scope.go:117] "RemoveContainer" containerID="a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b" Sep 30 03:15:08 crc kubenswrapper[4744]: E0930 03:15:08.011709 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b\": container with ID starting with a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b not found: ID does not exist" containerID="a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.011736 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b"} err="failed to get container status \"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b\": rpc error: code = NotFound desc = could not find container \"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b\": container with ID starting with a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b not found: ID does not exist" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.011756 4744 scope.go:117] "RemoveContainer" containerID="8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26" Sep 30 03:15:08 crc kubenswrapper[4744]: E0930 03:15:08.012033 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26\": container with ID starting with 8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26 not found: ID does not exist" containerID="8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.012056 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26"} err="failed to get container status \"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26\": rpc error: code = NotFound desc = could not find container \"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26\": container with ID starting with 8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26 not found: ID does not exist" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.012069 4744 scope.go:117] "RemoveContainer" containerID="a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 
03:15:08.012261 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b"} err="failed to get container status \"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b\": rpc error: code = NotFound desc = could not find container \"a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b\": container with ID starting with a2e18555864a00d3b3ac3948349e0732f73a7af1cb02c2a754997e784575136b not found: ID does not exist" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.012277 4744 scope.go:117] "RemoveContainer" containerID="8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.012439 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26"} err="failed to get container status \"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26\": rpc error: code = NotFound desc = could not find container \"8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26\": container with ID starting with 8de47fa646709ea037a9e4b1f8ed293f182b888c423750303305ff0adc156a26 not found: ID does not exist" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.094687 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-logs\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.095629 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-config-data\") pod \"nova-metadata-0\" (UID: 
\"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.095761 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.095884 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.096024 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l5c2\" (UniqueName: \"kubernetes.io/projected/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-kube-api-access-5l5c2\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.197957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l5c2\" (UniqueName: \"kubernetes.io/projected/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-kube-api-access-5l5c2\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.198035 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-logs\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc 
kubenswrapper[4744]: I0930 03:15:08.198066 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-config-data\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.198139 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.198216 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.199462 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-logs\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.202095 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-config-data\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.208038 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.211307 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.216183 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l5c2\" (UniqueName: \"kubernetes.io/projected/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-kube-api-access-5l5c2\") pod \"nova-metadata-0\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.284899 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.800947 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:08 crc kubenswrapper[4744]: W0930 03:15:08.804939 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode505a047_5448_43a5_8d2c_cc5bdb4db4ee.slice/crio-aa1d7844006ba51fc4e4c2bf449c76ba34c83d24060e87d962b432b945f738d6 WatchSource:0}: Error finding container aa1d7844006ba51fc4e4c2bf449c76ba34c83d24060e87d962b432b945f738d6: Status 404 returned error can't find the container with id aa1d7844006ba51fc4e4c2bf449c76ba34c83d24060e87d962b432b945f738d6 Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.867517 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e505a047-5448-43a5-8d2c-cc5bdb4db4ee","Type":"ContainerStarted","Data":"aa1d7844006ba51fc4e4c2bf449c76ba34c83d24060e87d962b432b945f738d6"} Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.871046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"45c58372-9d54-41ad-8059-5666ff3ab3c6","Type":"ContainerStarted","Data":"aeed869272fc36fefc21068291f854803f4ef5eb15f0bb088c6f17a0ab9d443a"} Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.871343 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.875077 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="264247e1-1195-4532-aafa-0eb43281e5b1" containerName="nova-scheduler-scheduler" containerID="cri-o://678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" gracePeriod=30 Sep 30 03:15:08 crc kubenswrapper[4744]: I0930 03:15:08.891510 4744 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.891492962 podStartE2EDuration="2.891492962s" podCreationTimestamp="2025-09-30 03:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:08.885331852 +0000 UTC m=+1236.058551846" watchObservedRunningTime="2025-09-30 03:15:08.891492962 +0000 UTC m=+1236.064712946" Sep 30 03:15:09 crc kubenswrapper[4744]: I0930 03:15:09.523213 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04775e5c-eb80-4810-b804-03282f73d19e" path="/var/lib/kubelet/pods/04775e5c-eb80-4810-b804-03282f73d19e/volumes" Sep 30 03:15:09 crc kubenswrapper[4744]: I0930 03:15:09.524934 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="409d892b-c223-43a8-9550-ff50fb759e2b" path="/var/lib/kubelet/pods/409d892b-c223-43a8-9550-ff50fb759e2b/volumes" Sep 30 03:15:09 crc kubenswrapper[4744]: I0930 03:15:09.808192 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 03:15:09 crc kubenswrapper[4744]: I0930 03:15:09.912719 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e505a047-5448-43a5-8d2c-cc5bdb4db4ee","Type":"ContainerStarted","Data":"ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41"} Sep 30 03:15:09 crc kubenswrapper[4744]: I0930 03:15:09.912764 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e505a047-5448-43a5-8d2c-cc5bdb4db4ee","Type":"ContainerStarted","Data":"c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c"} Sep 30 03:15:09 crc kubenswrapper[4744]: I0930 03:15:09.967760 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.96773168 podStartE2EDuration="2.96773168s" podCreationTimestamp="2025-09-30 
03:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:09.944347275 +0000 UTC m=+1237.117567289" watchObservedRunningTime="2025-09-30 03:15:09.96773168 +0000 UTC m=+1237.140951664" Sep 30 03:15:11 crc kubenswrapper[4744]: E0930 03:15:11.047906 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 03:15:11 crc kubenswrapper[4744]: E0930 03:15:11.052338 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 03:15:11 crc kubenswrapper[4744]: E0930 03:15:11.054744 4744 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 03:15:11 crc kubenswrapper[4744]: E0930 03:15:11.054855 4744 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="264247e1-1195-4532-aafa-0eb43281e5b1" containerName="nova-scheduler-scheduler" Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.812081 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.933571 4744 generic.go:334] "Generic (PLEG): container finished" podID="264247e1-1195-4532-aafa-0eb43281e5b1" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" exitCode=0 Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.933628 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"264247e1-1195-4532-aafa-0eb43281e5b1","Type":"ContainerDied","Data":"678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db"} Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.933664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"264247e1-1195-4532-aafa-0eb43281e5b1","Type":"ContainerDied","Data":"32780447bc2d1b203d2a553040cf655970afaf84fae0e9bc72a494f563023f5b"} Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.933687 4744 scope.go:117] "RemoveContainer" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.933855 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.975409 4744 scope.go:117] "RemoveContainer" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" Sep 30 03:15:11 crc kubenswrapper[4744]: E0930 03:15:11.976404 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db\": container with ID starting with 678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db not found: ID does not exist" containerID="678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db" Sep 30 03:15:11 crc kubenswrapper[4744]: I0930 03:15:11.976557 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db"} err="failed to get container status \"678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db\": rpc error: code = NotFound desc = could not find container \"678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db\": container with ID starting with 678a062c90a46e35f8faff48f213746e116fb21eb105fd3ad9fedaeb756836db not found: ID does not exist" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.003044 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf25q\" (UniqueName: \"kubernetes.io/projected/264247e1-1195-4532-aafa-0eb43281e5b1-kube-api-access-pf25q\") pod \"264247e1-1195-4532-aafa-0eb43281e5b1\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.003252 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-config-data\") pod \"264247e1-1195-4532-aafa-0eb43281e5b1\" (UID: 
\"264247e1-1195-4532-aafa-0eb43281e5b1\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.003439 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-combined-ca-bundle\") pod \"264247e1-1195-4532-aafa-0eb43281e5b1\" (UID: \"264247e1-1195-4532-aafa-0eb43281e5b1\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.015577 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264247e1-1195-4532-aafa-0eb43281e5b1-kube-api-access-pf25q" (OuterVolumeSpecName: "kube-api-access-pf25q") pod "264247e1-1195-4532-aafa-0eb43281e5b1" (UID: "264247e1-1195-4532-aafa-0eb43281e5b1"). InnerVolumeSpecName "kube-api-access-pf25q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.052541 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-config-data" (OuterVolumeSpecName: "config-data") pod "264247e1-1195-4532-aafa-0eb43281e5b1" (UID: "264247e1-1195-4532-aafa-0eb43281e5b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.065486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "264247e1-1195-4532-aafa-0eb43281e5b1" (UID: "264247e1-1195-4532-aafa-0eb43281e5b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.106495 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.106562 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf25q\" (UniqueName: \"kubernetes.io/projected/264247e1-1195-4532-aafa-0eb43281e5b1-kube-api-access-pf25q\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.106587 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264247e1-1195-4532-aafa-0eb43281e5b1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.266459 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.276697 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.298635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:12 crc kubenswrapper[4744]: E0930 03:15:12.299248 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264247e1-1195-4532-aafa-0eb43281e5b1" containerName="nova-scheduler-scheduler" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.299314 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="264247e1-1195-4532-aafa-0eb43281e5b1" containerName="nova-scheduler-scheduler" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.299562 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="264247e1-1195-4532-aafa-0eb43281e5b1" containerName="nova-scheduler-scheduler" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 
03:15:12.300197 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.303162 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.312301 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.413791 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-config-data\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.414085 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.414494 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5cn\" (UniqueName: \"kubernetes.io/projected/db5bd596-1d8b-418b-9239-8ff206093f2d-kube-api-access-xk5cn\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.516923 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5cn\" (UniqueName: \"kubernetes.io/projected/db5bd596-1d8b-418b-9239-8ff206093f2d-kube-api-access-xk5cn\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 
03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.517363 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-config-data\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.517586 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.522431 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-config-data\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.530762 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.537803 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5cn\" (UniqueName: \"kubernetes.io/projected/db5bd596-1d8b-418b-9239-8ff206093f2d-kube-api-access-xk5cn\") pod \"nova-scheduler-0\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.627821 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.770885 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.927146 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt4b4\" (UniqueName: \"kubernetes.io/projected/346dd188-63fa-4351-97ec-8e00be6cb731-kube-api-access-bt4b4\") pod \"346dd188-63fa-4351-97ec-8e00be6cb731\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.927265 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-combined-ca-bundle\") pod \"346dd188-63fa-4351-97ec-8e00be6cb731\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.927404 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-config-data\") pod \"346dd188-63fa-4351-97ec-8e00be6cb731\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.927466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/346dd188-63fa-4351-97ec-8e00be6cb731-logs\") pod \"346dd188-63fa-4351-97ec-8e00be6cb731\" (UID: \"346dd188-63fa-4351-97ec-8e00be6cb731\") " Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.929779 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346dd188-63fa-4351-97ec-8e00be6cb731-logs" (OuterVolumeSpecName: "logs") pod "346dd188-63fa-4351-97ec-8e00be6cb731" (UID: "346dd188-63fa-4351-97ec-8e00be6cb731"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.934136 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346dd188-63fa-4351-97ec-8e00be6cb731-kube-api-access-bt4b4" (OuterVolumeSpecName: "kube-api-access-bt4b4") pod "346dd188-63fa-4351-97ec-8e00be6cb731" (UID: "346dd188-63fa-4351-97ec-8e00be6cb731"). InnerVolumeSpecName "kube-api-access-bt4b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.952095 4744 generic.go:334] "Generic (PLEG): container finished" podID="346dd188-63fa-4351-97ec-8e00be6cb731" containerID="704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738" exitCode=0 Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.952149 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"346dd188-63fa-4351-97ec-8e00be6cb731","Type":"ContainerDied","Data":"704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738"} Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.952180 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"346dd188-63fa-4351-97ec-8e00be6cb731","Type":"ContainerDied","Data":"07268e825bd6dd21bd9e7930b1f132caf8a3f4644cf56be8ced3a846099156be"} Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.952200 4744 scope.go:117] "RemoveContainer" containerID="704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.952345 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.965112 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "346dd188-63fa-4351-97ec-8e00be6cb731" (UID: "346dd188-63fa-4351-97ec-8e00be6cb731"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.972407 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-config-data" (OuterVolumeSpecName: "config-data") pod "346dd188-63fa-4351-97ec-8e00be6cb731" (UID: "346dd188-63fa-4351-97ec-8e00be6cb731"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.973091 4744 scope.go:117] "RemoveContainer" containerID="07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.989752 4744 scope.go:117] "RemoveContainer" containerID="704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738" Sep 30 03:15:12 crc kubenswrapper[4744]: E0930 03:15:12.990260 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738\": container with ID starting with 704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738 not found: ID does not exist" containerID="704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.990303 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738"} err="failed to get 
container status \"704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738\": rpc error: code = NotFound desc = could not find container \"704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738\": container with ID starting with 704ecb0019d7374fafdc1ed88999398817a4799c1db9254385b57447bad81738 not found: ID does not exist" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.990331 4744 scope.go:117] "RemoveContainer" containerID="07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025" Sep 30 03:15:12 crc kubenswrapper[4744]: E0930 03:15:12.990631 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025\": container with ID starting with 07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025 not found: ID does not exist" containerID="07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025" Sep 30 03:15:12 crc kubenswrapper[4744]: I0930 03:15:12.990661 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025"} err="failed to get container status \"07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025\": rpc error: code = NotFound desc = could not find container \"07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025\": container with ID starting with 07e80ce7db0087cdf3aedee21772f1196955bc58c702bba735a43ef59f7c2025 not found: ID does not exist" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.030123 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.030162 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/346dd188-63fa-4351-97ec-8e00be6cb731-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.030175 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/346dd188-63fa-4351-97ec-8e00be6cb731-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.030191 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt4b4\" (UniqueName: \"kubernetes.io/projected/346dd188-63fa-4351-97ec-8e00be6cb731-kube-api-access-bt4b4\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.105983 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:13 crc kubenswrapper[4744]: W0930 03:15:13.106688 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb5bd596_1d8b_418b_9239_8ff206093f2d.slice/crio-3cd6928aaed68eaf444aaa9bfc5af4bf0c3a523a2e040da08f7d730bb8e95a4a WatchSource:0}: Error finding container 3cd6928aaed68eaf444aaa9bfc5af4bf0c3a523a2e040da08f7d730bb8e95a4a: Status 404 returned error can't find the container with id 3cd6928aaed68eaf444aaa9bfc5af4bf0c3a523a2e040da08f7d730bb8e95a4a Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.285873 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.285929 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.302648 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.312690 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:13 crc 
kubenswrapper[4744]: I0930 03:15:13.321660 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:13 crc kubenswrapper[4744]: E0930 03:15:13.322124 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-log" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.322145 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-log" Sep 30 03:15:13 crc kubenswrapper[4744]: E0930 03:15:13.322170 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-api" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.322179 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-api" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.322536 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-log" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.322574 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" containerName="nova-api-api" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.323815 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.326529 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.368668 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.449955 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.450086 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-config-data\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.450122 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzzqn\" (UniqueName: \"kubernetes.io/projected/502a21a1-d888-4be3-86cf-6bbe213ec9a9-kube-api-access-zzzqn\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.450166 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502a21a1-d888-4be3-86cf-6bbe213ec9a9-logs\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.524879 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="264247e1-1195-4532-aafa-0eb43281e5b1" path="/var/lib/kubelet/pods/264247e1-1195-4532-aafa-0eb43281e5b1/volumes" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.525837 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346dd188-63fa-4351-97ec-8e00be6cb731" path="/var/lib/kubelet/pods/346dd188-63fa-4351-97ec-8e00be6cb731/volumes" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.552358 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzzqn\" (UniqueName: \"kubernetes.io/projected/502a21a1-d888-4be3-86cf-6bbe213ec9a9-kube-api-access-zzzqn\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.552476 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502a21a1-d888-4be3-86cf-6bbe213ec9a9-logs\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.552576 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.552686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-config-data\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.552992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/502a21a1-d888-4be3-86cf-6bbe213ec9a9-logs\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.557268 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.561268 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-config-data\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.572699 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzzqn\" (UniqueName: \"kubernetes.io/projected/502a21a1-d888-4be3-86cf-6bbe213ec9a9-kube-api-access-zzzqn\") pod \"nova-api-0\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.679446 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.747877 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.748523 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a6379476-59f6-4c51-8f3f-7ea563d15030" containerName="kube-state-metrics" containerID="cri-o://ef7b9da0c8679b124b65e81d314aa715994931a172d5756599fac53a69589e6e" gracePeriod=30 Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.981508 4744 generic.go:334] "Generic (PLEG): container finished" podID="a6379476-59f6-4c51-8f3f-7ea563d15030" containerID="ef7b9da0c8679b124b65e81d314aa715994931a172d5756599fac53a69589e6e" exitCode=2 Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.981599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6379476-59f6-4c51-8f3f-7ea563d15030","Type":"ContainerDied","Data":"ef7b9da0c8679b124b65e81d314aa715994931a172d5756599fac53a69589e6e"} Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.983926 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"db5bd596-1d8b-418b-9239-8ff206093f2d","Type":"ContainerStarted","Data":"8d0c4b9649d615dbb39d654e2ab22eaacba7914f38569201a69bf2ab1f0eb241"} Sep 30 03:15:13 crc kubenswrapper[4744]: I0930 03:15:13.983977 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"db5bd596-1d8b-418b-9239-8ff206093f2d","Type":"ContainerStarted","Data":"3cd6928aaed68eaf444aaa9bfc5af4bf0c3a523a2e040da08f7d730bb8e95a4a"} Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.001087 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.001070492 podStartE2EDuration="2.001070492s" 
podCreationTimestamp="2025-09-30 03:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:13.999440321 +0000 UTC m=+1241.172660295" watchObservedRunningTime="2025-09-30 03:15:14.001070492 +0000 UTC m=+1241.174290466" Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.140497 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.166124 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.194598 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhsmk\" (UniqueName: \"kubernetes.io/projected/a6379476-59f6-4c51-8f3f-7ea563d15030-kube-api-access-fhsmk\") pod \"a6379476-59f6-4c51-8f3f-7ea563d15030\" (UID: \"a6379476-59f6-4c51-8f3f-7ea563d15030\") " Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.203845 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6379476-59f6-4c51-8f3f-7ea563d15030-kube-api-access-fhsmk" (OuterVolumeSpecName: "kube-api-access-fhsmk") pod "a6379476-59f6-4c51-8f3f-7ea563d15030" (UID: "a6379476-59f6-4c51-8f3f-7ea563d15030"). InnerVolumeSpecName "kube-api-access-fhsmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.296919 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhsmk\" (UniqueName: \"kubernetes.io/projected/a6379476-59f6-4c51-8f3f-7ea563d15030-kube-api-access-fhsmk\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:14 crc kubenswrapper[4744]: I0930 03:15:14.999623 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6379476-59f6-4c51-8f3f-7ea563d15030","Type":"ContainerDied","Data":"950e98f0ae559603c3582a46c318149c09b8522b5df26b9918b1ca516d4231df"} Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.000145 4744 scope.go:117] "RemoveContainer" containerID="ef7b9da0c8679b124b65e81d314aa715994931a172d5756599fac53a69589e6e" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.000395 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.010116 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"502a21a1-d888-4be3-86cf-6bbe213ec9a9","Type":"ContainerStarted","Data":"e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93"} Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.010178 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"502a21a1-d888-4be3-86cf-6bbe213ec9a9","Type":"ContainerStarted","Data":"45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9"} Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.010193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"502a21a1-d888-4be3-86cf-6bbe213ec9a9","Type":"ContainerStarted","Data":"9456b997f46bab4e45f8d2ddd500556858f2a38f566bd4e577683d3277f7a368"} Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.094730 4744 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.094706329 podStartE2EDuration="2.094706329s" podCreationTimestamp="2025-09-30 03:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:15.066809563 +0000 UTC m=+1242.240029547" watchObservedRunningTime="2025-09-30 03:15:15.094706329 +0000 UTC m=+1242.267926313" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.118891 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.132474 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.148674 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:15:15 crc kubenswrapper[4744]: E0930 03:15:15.149065 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6379476-59f6-4c51-8f3f-7ea563d15030" containerName="kube-state-metrics" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.149076 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6379476-59f6-4c51-8f3f-7ea563d15030" containerName="kube-state-metrics" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.149253 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6379476-59f6-4c51-8f3f-7ea563d15030" containerName="kube-state-metrics" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.149894 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.153692 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.154283 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.154883 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.215832 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.215883 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lvt\" (UniqueName: \"kubernetes.io/projected/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-api-access-w5lvt\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.215905 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.216091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.317467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.317519 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.317545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lvt\" (UniqueName: \"kubernetes.io/projected/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-api-access-w5lvt\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.317564 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.322582 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.323857 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.331479 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.332967 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lvt\" (UniqueName: \"kubernetes.io/projected/825e7c08-c607-429e-bf96-d8c332d03cd1-kube-api-access-w5lvt\") pod \"kube-state-metrics-0\" (UID: \"825e7c08-c607-429e-bf96-d8c332d03cd1\") " pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.467667 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.522042 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6379476-59f6-4c51-8f3f-7ea563d15030" path="/var/lib/kubelet/pods/a6379476-59f6-4c51-8f3f-7ea563d15030/volumes" Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.771105 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.771403 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-central-agent" containerID="cri-o://55d8be7f596fdf9f46e94e98397b057a6a8862e326f50d7221f6f7cdcbc5e986" gracePeriod=30 Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.771425 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="proxy-httpd" containerID="cri-o://5d9de43c392f1dd77c2482e42938e5a0ff5c21049ca56a5073f48aa6162d9f11" gracePeriod=30 Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.771477 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="sg-core" containerID="cri-o://9b93a3298be23f7ea43796c8d48523d96a63138e594b2576bfd24ee5e49d4a83" gracePeriod=30 Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.771506 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-notification-agent" containerID="cri-o://7e814f468532a2b9aae79387b6dad42116d6571f3d39ed7e788f0f09d3fdfc01" gracePeriod=30 Sep 30 03:15:15 crc kubenswrapper[4744]: I0930 03:15:15.940647 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Sep 30 03:15:16 crc kubenswrapper[4744]: I0930 03:15:16.023772 4744 generic.go:334] "Generic (PLEG): container finished" podID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerID="5d9de43c392f1dd77c2482e42938e5a0ff5c21049ca56a5073f48aa6162d9f11" exitCode=0 Sep 30 03:15:16 crc kubenswrapper[4744]: I0930 03:15:16.023828 4744 generic.go:334] "Generic (PLEG): container finished" podID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerID="9b93a3298be23f7ea43796c8d48523d96a63138e594b2576bfd24ee5e49d4a83" exitCode=2 Sep 30 03:15:16 crc kubenswrapper[4744]: I0930 03:15:16.023860 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerDied","Data":"5d9de43c392f1dd77c2482e42938e5a0ff5c21049ca56a5073f48aa6162d9f11"} Sep 30 03:15:16 crc kubenswrapper[4744]: I0930 03:15:16.024135 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerDied","Data":"9b93a3298be23f7ea43796c8d48523d96a63138e594b2576bfd24ee5e49d4a83"} Sep 30 03:15:16 crc kubenswrapper[4744]: I0930 03:15:16.027320 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"825e7c08-c607-429e-bf96-d8c332d03cd1","Type":"ContainerStarted","Data":"67d95f2c5405af828c84c454f78c27c2b9c86c2c3ecc1ca7257270abf8344f0e"} Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.042094 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"825e7c08-c607-429e-bf96-d8c332d03cd1","Type":"ContainerStarted","Data":"470b5bf38f1b30be86e7c328e3a653ca03c3ffb255a70dbeeac25b1938d2658f"} Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.043357 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.048037 4744 
generic.go:334] "Generic (PLEG): container finished" podID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerID="7e814f468532a2b9aae79387b6dad42116d6571f3d39ed7e788f0f09d3fdfc01" exitCode=0 Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.048065 4744 generic.go:334] "Generic (PLEG): container finished" podID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerID="55d8be7f596fdf9f46e94e98397b057a6a8862e326f50d7221f6f7cdcbc5e986" exitCode=0 Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.048084 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerDied","Data":"7e814f468532a2b9aae79387b6dad42116d6571f3d39ed7e788f0f09d3fdfc01"} Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.048101 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerDied","Data":"55d8be7f596fdf9f46e94e98397b057a6a8862e326f50d7221f6f7cdcbc5e986"} Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.072821 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.688159718 podStartE2EDuration="2.072790973s" podCreationTimestamp="2025-09-30 03:15:15 +0000 UTC" firstStartedPulling="2025-09-30 03:15:15.954300724 +0000 UTC m=+1243.127520688" lastFinishedPulling="2025-09-30 03:15:16.338931959 +0000 UTC m=+1243.512151943" observedRunningTime="2025-09-30 03:15:17.062289496 +0000 UTC m=+1244.235509470" watchObservedRunningTime="2025-09-30 03:15:17.072790973 +0000 UTC m=+1244.246010967" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.282884 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.323604 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493663 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-combined-ca-bundle\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493762 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-scripts\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493796 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-log-httpd\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493826 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-sg-core-conf-yaml\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493884 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwfk5\" (UniqueName: \"kubernetes.io/projected/fbfa47fe-9e61-4f18-91d2-e6af1296f033-kube-api-access-bwfk5\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493928 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-run-httpd\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.493984 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-config-data\") pod \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\" (UID: \"fbfa47fe-9e61-4f18-91d2-e6af1296f033\") " Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.494474 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.495050 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.511572 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-scripts" (OuterVolumeSpecName: "scripts") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.511580 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfa47fe-9e61-4f18-91d2-e6af1296f033-kube-api-access-bwfk5" (OuterVolumeSpecName: "kube-api-access-bwfk5") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). InnerVolumeSpecName "kube-api-access-bwfk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.547275 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.586267 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.595904 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.595928 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.595938 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.595947 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.595957 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwfk5\" (UniqueName: \"kubernetes.io/projected/fbfa47fe-9e61-4f18-91d2-e6af1296f033-kube-api-access-bwfk5\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.595969 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbfa47fe-9e61-4f18-91d2-e6af1296f033-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.605582 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-config-data" (OuterVolumeSpecName: "config-data") pod "fbfa47fe-9e61-4f18-91d2-e6af1296f033" (UID: "fbfa47fe-9e61-4f18-91d2-e6af1296f033"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.687875 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 03:15:17 crc kubenswrapper[4744]: I0930 03:15:17.698644 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfa47fe-9e61-4f18-91d2-e6af1296f033-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.062696 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbfa47fe-9e61-4f18-91d2-e6af1296f033","Type":"ContainerDied","Data":"5ab0806f7c6431db3668c3af4015e967167ffc39ca8a55ec22cc4b85659e6eb1"} Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.062734 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.062764 4744 scope.go:117] "RemoveContainer" containerID="5d9de43c392f1dd77c2482e42938e5a0ff5c21049ca56a5073f48aa6162d9f11" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.089544 4744 scope.go:117] "RemoveContainer" containerID="9b93a3298be23f7ea43796c8d48523d96a63138e594b2576bfd24ee5e49d4a83" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.109687 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.114866 4744 scope.go:117] "RemoveContainer" containerID="7e814f468532a2b9aae79387b6dad42116d6571f3d39ed7e788f0f09d3fdfc01" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.133860 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.142431 4744 scope.go:117] "RemoveContainer" containerID="55d8be7f596fdf9f46e94e98397b057a6a8862e326f50d7221f6f7cdcbc5e986" Sep 
30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.149735 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:18 crc kubenswrapper[4744]: E0930 03:15:18.150305 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-central-agent" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150323 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-central-agent" Sep 30 03:15:18 crc kubenswrapper[4744]: E0930 03:15:18.150340 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-notification-agent" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150346 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-notification-agent" Sep 30 03:15:18 crc kubenswrapper[4744]: E0930 03:15:18.150355 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="sg-core" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150361 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="sg-core" Sep 30 03:15:18 crc kubenswrapper[4744]: E0930 03:15:18.150462 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="proxy-httpd" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150471 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="proxy-httpd" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150767 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="proxy-httpd" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 
03:15:18.150786 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="sg-core" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150798 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-central-agent" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.150806 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" containerName="ceilometer-notification-agent" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.153011 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.158362 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.158638 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.158795 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.158796 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208631 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208706 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24gbf\" (UniqueName: 
\"kubernetes.io/projected/45dcef2c-67a3-462d-8039-848684f60ede-kube-api-access-24gbf\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208742 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208784 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-log-httpd\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208825 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-run-httpd\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208873 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-scripts\") pod \"ceilometer-0\" (UID: 
\"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.208902 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-config-data\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.287055 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.287139 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311130 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24gbf\" (UniqueName: \"kubernetes.io/projected/45dcef2c-67a3-462d-8039-848684f60ede-kube-api-access-24gbf\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311183 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311237 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-log-httpd\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311258 4744 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311278 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-run-httpd\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311334 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-scripts\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-config-data\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.311424 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.312592 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-run-httpd\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc 
kubenswrapper[4744]: I0930 03:15:18.312621 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-log-httpd\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.315732 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-scripts\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.316691 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.316866 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.317525 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-config-data\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.319918 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.333660 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24gbf\" (UniqueName: \"kubernetes.io/projected/45dcef2c-67a3-462d-8039-848684f60ede-kube-api-access-24gbf\") pod \"ceilometer-0\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " pod="openstack/ceilometer-0" Sep 30 03:15:18 crc kubenswrapper[4744]: I0930 03:15:18.480073 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:19 crc kubenswrapper[4744]: W0930 03:15:19.017705 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45dcef2c_67a3_462d_8039_848684f60ede.slice/crio-8ef2d8842a3ee90e9f0767003a9d1d885e9cae43350a63b7d7379cdc0e7db235 WatchSource:0}: Error finding container 8ef2d8842a3ee90e9f0767003a9d1d885e9cae43350a63b7d7379cdc0e7db235: Status 404 returned error can't find the container with id 8ef2d8842a3ee90e9f0767003a9d1d885e9cae43350a63b7d7379cdc0e7db235 Sep 30 03:15:19 crc kubenswrapper[4744]: I0930 03:15:19.021484 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:19 crc kubenswrapper[4744]: I0930 03:15:19.073167 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerStarted","Data":"8ef2d8842a3ee90e9f0767003a9d1d885e9cae43350a63b7d7379cdc0e7db235"} Sep 30 03:15:19 crc kubenswrapper[4744]: I0930 03:15:19.296599 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 03:15:19 crc 
kubenswrapper[4744]: I0930 03:15:19.296683 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 03:15:19 crc kubenswrapper[4744]: I0930 03:15:19.516190 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfa47fe-9e61-4f18-91d2-e6af1296f033" path="/var/lib/kubelet/pods/fbfa47fe-9e61-4f18-91d2-e6af1296f033/volumes" Sep 30 03:15:20 crc kubenswrapper[4744]: I0930 03:15:20.086785 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerStarted","Data":"4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c"} Sep 30 03:15:21 crc kubenswrapper[4744]: I0930 03:15:21.103261 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerStarted","Data":"428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a"} Sep 30 03:15:21 crc kubenswrapper[4744]: I0930 03:15:21.103987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerStarted","Data":"1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917"} Sep 30 03:15:22 crc kubenswrapper[4744]: I0930 03:15:22.628616 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 03:15:22 crc kubenswrapper[4744]: I0930 03:15:22.677630 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 03:15:23 crc kubenswrapper[4744]: I0930 03:15:23.129046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerStarted","Data":"3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4"} Sep 30 03:15:23 crc kubenswrapper[4744]: I0930 03:15:23.158972 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.947786378 podStartE2EDuration="5.158923315s" podCreationTimestamp="2025-09-30 03:15:18 +0000 UTC" firstStartedPulling="2025-09-30 03:15:19.022783564 +0000 UTC m=+1246.196003538" lastFinishedPulling="2025-09-30 03:15:22.233920491 +0000 UTC m=+1249.407140475" observedRunningTime="2025-09-30 03:15:23.1545602 +0000 UTC m=+1250.327780204" watchObservedRunningTime="2025-09-30 03:15:23.158923315 +0000 UTC m=+1250.332143289" Sep 30 03:15:23 crc kubenswrapper[4744]: I0930 03:15:23.189686 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 03:15:23 crc kubenswrapper[4744]: I0930 03:15:23.679988 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 03:15:23 crc kubenswrapper[4744]: I0930 03:15:23.680047 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 03:15:24 crc kubenswrapper[4744]: I0930 03:15:24.135431 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 03:15:24 crc kubenswrapper[4744]: I0930 03:15:24.762616 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 03:15:24 crc kubenswrapper[4744]: I0930 03:15:24.762767 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 03:15:25 crc kubenswrapper[4744]: I0930 03:15:25.484168 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 03:15:28 crc kubenswrapper[4744]: I0930 03:15:28.293643 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 03:15:28 crc kubenswrapper[4744]: I0930 03:15:28.295311 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 03:15:28 crc kubenswrapper[4744]: I0930 03:15:28.303354 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 03:15:28 crc kubenswrapper[4744]: I0930 03:15:28.310775 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.210133 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.226626 4744 generic.go:334] "Generic (PLEG): container finished" podID="b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" containerID="2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba" exitCode=137 Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.226669 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374","Type":"ContainerDied","Data":"2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba"} Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.226821 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374","Type":"ContainerDied","Data":"6bf5cf6ba1c41efdfb6e3555c5b77113bb68f9a68579e56b53675add9d6a3a02"} Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.226775 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.227392 4744 scope.go:117] "RemoveContainer" containerID="2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.281610 4744 scope.go:117] "RemoveContainer" containerID="2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba" Sep 30 03:15:31 crc kubenswrapper[4744]: E0930 03:15:31.282026 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba\": container with ID starting with 2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba not found: ID does not exist" containerID="2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.282067 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba"} err="failed to get container status \"2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba\": rpc error: code = NotFound desc = could not find container \"2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba\": container with ID starting with 2025dbfc2a220919266b21ea361765dce688ceb5f456a15d57a91c615ffeb5ba not found: ID does not exist" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.360957 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-combined-ca-bundle\") pod \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.361052 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-config-data\") pod \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.361153 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p46d8\" (UniqueName: \"kubernetes.io/projected/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-kube-api-access-p46d8\") pod \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\" (UID: \"b10cc4e5-8c8d-4012-b0ef-3c7beaea3374\") " Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.374122 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-kube-api-access-p46d8" (OuterVolumeSpecName: "kube-api-access-p46d8") pod "b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" (UID: "b10cc4e5-8c8d-4012-b0ef-3c7beaea3374"). InnerVolumeSpecName "kube-api-access-p46d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.393834 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" (UID: "b10cc4e5-8c8d-4012-b0ef-3c7beaea3374"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.413847 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-config-data" (OuterVolumeSpecName: "config-data") pod "b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" (UID: "b10cc4e5-8c8d-4012-b0ef-3c7beaea3374"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.465066 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.465111 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.465129 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p46d8\" (UniqueName: \"kubernetes.io/projected/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374-kube-api-access-p46d8\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.602719 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.628713 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.649735 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:15:31 crc kubenswrapper[4744]: E0930 03:15:31.652204 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.652263 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.652805 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 
03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.657260 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.659562 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.660243 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.660274 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.667352 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.675539 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.675669 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.675729 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.675752 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fszlg\" (UniqueName: \"kubernetes.io/projected/843d7ca4-8741-4c46-9e24-c432261d5c57-kube-api-access-fszlg\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.675805 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.778086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.778179 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.778204 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fszlg\" (UniqueName: \"kubernetes.io/projected/843d7ca4-8741-4c46-9e24-c432261d5c57-kube-api-access-fszlg\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.778265 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.778381 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.783186 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.783287 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.783346 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc 
kubenswrapper[4744]: I0930 03:15:31.784272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/843d7ca4-8741-4c46-9e24-c432261d5c57-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.797121 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fszlg\" (UniqueName: \"kubernetes.io/projected/843d7ca4-8741-4c46-9e24-c432261d5c57-kube-api-access-fszlg\") pod \"nova-cell1-novncproxy-0\" (UID: \"843d7ca4-8741-4c46-9e24-c432261d5c57\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:31 crc kubenswrapper[4744]: I0930 03:15:31.978884 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:33 crc kubenswrapper[4744]: I0930 03:15:33.067744 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 03:15:33 crc kubenswrapper[4744]: I0930 03:15:33.253591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"843d7ca4-8741-4c46-9e24-c432261d5c57","Type":"ContainerStarted","Data":"eb78418318742d618b8e5cd6247c72a9f0b322780ce4bfa90b969a44b8e5ef7e"} Sep 30 03:15:33 crc kubenswrapper[4744]: I0930 03:15:33.529290 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b10cc4e5-8c8d-4012-b0ef-3c7beaea3374" path="/var/lib/kubelet/pods/b10cc4e5-8c8d-4012-b0ef-3c7beaea3374/volumes" Sep 30 03:15:33 crc kubenswrapper[4744]: I0930 03:15:33.683729 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 03:15:33 crc kubenswrapper[4744]: I0930 03:15:33.684611 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 03:15:33 crc 
kubenswrapper[4744]: I0930 03:15:33.685935 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 03:15:33 crc kubenswrapper[4744]: I0930 03:15:33.687282 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.266666 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"843d7ca4-8741-4c46-9e24-c432261d5c57","Type":"ContainerStarted","Data":"c93afa73d0a9727e52d2a2680bbf811f3514eed46d7905f104e8e23231ba9a54"} Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.266705 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.271604 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.297964 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.297943328 podStartE2EDuration="3.297943328s" podCreationTimestamp="2025-09-30 03:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:34.288446223 +0000 UTC m=+1261.461666227" watchObservedRunningTime="2025-09-30 03:15:34.297943328 +0000 UTC m=+1261.471163312" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.358941 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.359025 4744 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.535998 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-s5klg"] Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.537510 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.554805 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-s5klg"] Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.642460 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-sb\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.642518 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmsf\" (UniqueName: \"kubernetes.io/projected/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-kube-api-access-ncmsf\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.642565 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-swift-storage-0\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " 
pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.642599 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-nb\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.642633 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-config\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.642666 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-svc\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.744630 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-sb\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.744710 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmsf\" (UniqueName: \"kubernetes.io/projected/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-kube-api-access-ncmsf\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " 
pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.744776 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-swift-storage-0\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.744820 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-nb\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.744867 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-config\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.744912 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-svc\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.746067 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-sb\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc 
kubenswrapper[4744]: I0930 03:15:34.746068 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-swift-storage-0\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.746103 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-svc\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.746694 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-nb\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.746703 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-config\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.765215 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmsf\" (UniqueName: \"kubernetes.io/projected/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-kube-api-access-ncmsf\") pod \"dnsmasq-dns-6559f4fbd7-s5klg\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:34 crc kubenswrapper[4744]: I0930 03:15:34.863705 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:35 crc kubenswrapper[4744]: W0930 03:15:35.420576 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda54a9e4e_1abf_4e10_a7cb_98c582b531fa.slice/crio-d769b7b8b1ece59bf8c56d9cef226bd2d6faf4acb0af5014770ad222b7129a46 WatchSource:0}: Error finding container d769b7b8b1ece59bf8c56d9cef226bd2d6faf4acb0af5014770ad222b7129a46: Status 404 returned error can't find the container with id d769b7b8b1ece59bf8c56d9cef226bd2d6faf4acb0af5014770ad222b7129a46 Sep 30 03:15:35 crc kubenswrapper[4744]: I0930 03:15:35.423317 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-s5klg"] Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.285662 4744 generic.go:334] "Generic (PLEG): container finished" podID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerID="98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347" exitCode=0 Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.285823 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" event={"ID":"a54a9e4e-1abf-4e10-a7cb-98c582b531fa","Type":"ContainerDied","Data":"98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347"} Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.287067 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" event={"ID":"a54a9e4e-1abf-4e10-a7cb-98c582b531fa","Type":"ContainerStarted","Data":"d769b7b8b1ece59bf8c56d9cef226bd2d6faf4acb0af5014770ad222b7129a46"} Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.522507 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.523004 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-central-agent" containerID="cri-o://4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c" gracePeriod=30 Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.523119 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-notification-agent" containerID="cri-o://1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917" gracePeriod=30 Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.523154 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="proxy-httpd" containerID="cri-o://3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4" gracePeriod=30 Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.523122 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="sg-core" containerID="cri-o://428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a" gracePeriod=30 Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.528867 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.214:3000/\": read tcp 10.217.0.2:45060->10.217.0.214:3000: read: connection reset by peer" Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.854670 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:36 crc kubenswrapper[4744]: I0930 03:15:36.979278 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.299143 4744 
generic.go:334] "Generic (PLEG): container finished" podID="45dcef2c-67a3-462d-8039-848684f60ede" containerID="3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4" exitCode=0 Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.299172 4744 generic.go:334] "Generic (PLEG): container finished" podID="45dcef2c-67a3-462d-8039-848684f60ede" containerID="428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a" exitCode=2 Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.299180 4744 generic.go:334] "Generic (PLEG): container finished" podID="45dcef2c-67a3-462d-8039-848684f60ede" containerID="4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c" exitCode=0 Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.299208 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerDied","Data":"3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4"} Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.299256 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerDied","Data":"428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a"} Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.299268 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerDied","Data":"4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c"} Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.301404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" event={"ID":"a54a9e4e-1abf-4e10-a7cb-98c582b531fa","Type":"ContainerStarted","Data":"8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a"} Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.301511 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-log" containerID="cri-o://45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9" gracePeriod=30 Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.301565 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-api" containerID="cri-o://e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93" gracePeriod=30 Sep 30 03:15:37 crc kubenswrapper[4744]: I0930 03:15:37.332485 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" podStartSLOduration=3.332461873 podStartE2EDuration="3.332461873s" podCreationTimestamp="2025-09-30 03:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:37.324160836 +0000 UTC m=+1264.497380810" watchObservedRunningTime="2025-09-30 03:15:37.332461873 +0000 UTC m=+1264.505681867" Sep 30 03:15:38 crc kubenswrapper[4744]: I0930 03:15:38.313899 4744 generic.go:334] "Generic (PLEG): container finished" podID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerID="45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9" exitCode=143 Sep 30 03:15:38 crc kubenswrapper[4744]: I0930 03:15:38.314021 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"502a21a1-d888-4be3-86cf-6bbe213ec9a9","Type":"ContainerDied","Data":"45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9"} Sep 30 03:15:38 crc kubenswrapper[4744]: I0930 03:15:38.314124 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:40 crc kubenswrapper[4744]: I0930 03:15:40.989490 4744 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.090152 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzzqn\" (UniqueName: \"kubernetes.io/projected/502a21a1-d888-4be3-86cf-6bbe213ec9a9-kube-api-access-zzzqn\") pod \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.091394 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502a21a1-d888-4be3-86cf-6bbe213ec9a9-logs\") pod \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.091425 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-config-data\") pod \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.091465 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-combined-ca-bundle\") pod \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\" (UID: \"502a21a1-d888-4be3-86cf-6bbe213ec9a9\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.092190 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/502a21a1-d888-4be3-86cf-6bbe213ec9a9-logs" (OuterVolumeSpecName: "logs") pod "502a21a1-d888-4be3-86cf-6bbe213ec9a9" (UID: "502a21a1-d888-4be3-86cf-6bbe213ec9a9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.096591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502a21a1-d888-4be3-86cf-6bbe213ec9a9-kube-api-access-zzzqn" (OuterVolumeSpecName: "kube-api-access-zzzqn") pod "502a21a1-d888-4be3-86cf-6bbe213ec9a9" (UID: "502a21a1-d888-4be3-86cf-6bbe213ec9a9"). InnerVolumeSpecName "kube-api-access-zzzqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.132139 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-config-data" (OuterVolumeSpecName: "config-data") pod "502a21a1-d888-4be3-86cf-6bbe213ec9a9" (UID: "502a21a1-d888-4be3-86cf-6bbe213ec9a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.149944 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "502a21a1-d888-4be3-86cf-6bbe213ec9a9" (UID: "502a21a1-d888-4be3-86cf-6bbe213ec9a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.193857 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/502a21a1-d888-4be3-86cf-6bbe213ec9a9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.193895 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.193912 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502a21a1-d888-4be3-86cf-6bbe213ec9a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.193929 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzzqn\" (UniqueName: \"kubernetes.io/projected/502a21a1-d888-4be3-86cf-6bbe213ec9a9-kube-api-access-zzzqn\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.196139 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.294926 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-scripts\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295021 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-run-httpd\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-ceilometer-tls-certs\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295237 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-log-httpd\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295273 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-config-data\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295300 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24gbf\" (UniqueName: 
\"kubernetes.io/projected/45dcef2c-67a3-462d-8039-848684f60ede-kube-api-access-24gbf\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295325 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-sg-core-conf-yaml\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295471 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-combined-ca-bundle\") pod \"45dcef2c-67a3-462d-8039-848684f60ede\" (UID: \"45dcef2c-67a3-462d-8039-848684f60ede\") " Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295625 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.295727 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.296016 4744 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.296043 4744 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45dcef2c-67a3-462d-8039-848684f60ede-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.298491 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-scripts" (OuterVolumeSpecName: "scripts") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.299054 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45dcef2c-67a3-462d-8039-848684f60ede-kube-api-access-24gbf" (OuterVolumeSpecName: "kube-api-access-24gbf") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "kube-api-access-24gbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.322695 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.358328 4744 generic.go:334] "Generic (PLEG): container finished" podID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerID="e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93" exitCode=0 Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.358434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"502a21a1-d888-4be3-86cf-6bbe213ec9a9","Type":"ContainerDied","Data":"e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93"} Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.358463 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"502a21a1-d888-4be3-86cf-6bbe213ec9a9","Type":"ContainerDied","Data":"9456b997f46bab4e45f8d2ddd500556858f2a38f566bd4e577683d3277f7a368"} Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.358480 4744 scope.go:117] "RemoveContainer" containerID="e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.358591 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.358857 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.366987 4744 generic.go:334] "Generic (PLEG): container finished" podID="45dcef2c-67a3-462d-8039-848684f60ede" containerID="1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917" exitCode=0 Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.367022 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerDied","Data":"1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917"} Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.367046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45dcef2c-67a3-462d-8039-848684f60ede","Type":"ContainerDied","Data":"8ef2d8842a3ee90e9f0767003a9d1d885e9cae43350a63b7d7379cdc0e7db235"} Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.367109 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.380155 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.397501 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.397538 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.397554 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24gbf\" (UniqueName: \"kubernetes.io/projected/45dcef2c-67a3-462d-8039-848684f60ede-kube-api-access-24gbf\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.397565 4744 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.397576 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.409897 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.421082 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.422506 4744 scope.go:117] "RemoveContainer" containerID="45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.426098 4744 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-config-data" (OuterVolumeSpecName: "config-data") pod "45dcef2c-67a3-462d-8039-848684f60ede" (UID: "45dcef2c-67a3-462d-8039-848684f60ede"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.442913 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.443577 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="proxy-httpd" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.443660 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="proxy-httpd" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.443757 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-central-agent" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.443817 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-central-agent" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.443922 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="sg-core" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.443989 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="sg-core" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.444072 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-api" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.444237 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" 
containerName="nova-api-api" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.444334 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-log" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.444428 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-log" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.444520 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-notification-agent" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.444603 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-notification-agent" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.444923 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-notification-agent" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.445024 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="ceilometer-central-agent" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.445128 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-log" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.445221 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" containerName="nova-api-api" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.445308 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="proxy-httpd" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.445407 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="45dcef2c-67a3-462d-8039-848684f60ede" containerName="sg-core" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.445985 4744 scope.go:117] "RemoveContainer" containerID="e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.446733 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93\": container with ID starting with e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93 not found: ID does not exist" containerID="e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.446763 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93"} err="failed to get container status \"e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93\": rpc error: code = NotFound desc = could not find container \"e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93\": container with ID starting with e9266ab4d884d1d0190b0d1c1530f5dbecee90b52fae2804af9b50f31182cb93 not found: ID does not exist" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.446783 4744 scope.go:117] "RemoveContainer" containerID="45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.447246 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.447405 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.448323 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9\": container with ID starting with 45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9 not found: ID does not exist" containerID="45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.448347 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9"} err="failed to get container status \"45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9\": rpc error: code = NotFound desc = could not find container \"45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9\": container with ID starting with 45cbd506da518c6dab421e363cf76696bf868205a81bad8ead568cde67ad5fb9 not found: ID does not exist" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.448361 4744 scope.go:117] "RemoveContainer" containerID="3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.450093 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.450251 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.451912 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.477172 4744 scope.go:117] "RemoveContainer" containerID="428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a" Sep 30 03:15:41 crc 
kubenswrapper[4744]: I0930 03:15:41.495853 4744 scope.go:117] "RemoveContainer" containerID="1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.499171 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45dcef2c-67a3-462d-8039-848684f60ede-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.513017 4744 scope.go:117] "RemoveContainer" containerID="4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.517318 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502a21a1-d888-4be3-86cf-6bbe213ec9a9" path="/var/lib/kubelet/pods/502a21a1-d888-4be3-86cf-6bbe213ec9a9/volumes" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.533269 4744 scope.go:117] "RemoveContainer" containerID="3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.533830 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4\": container with ID starting with 3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4 not found: ID does not exist" containerID="3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.533912 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4"} err="failed to get container status \"3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4\": rpc error: code = NotFound desc = could not find container \"3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4\": container with ID starting with 
3d7f30fcfbba3139cfb1d3ef58b33e8f27c99f146cbdd40c02445a106c28e8a4 not found: ID does not exist" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.533967 4744 scope.go:117] "RemoveContainer" containerID="428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.534316 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a\": container with ID starting with 428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a not found: ID does not exist" containerID="428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.534345 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a"} err="failed to get container status \"428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a\": rpc error: code = NotFound desc = could not find container \"428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a\": container with ID starting with 428c5668ad2bb9a0f0c802e4e0c842c80e26d5dcb4efbaba121617210a15921a not found: ID does not exist" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.534363 4744 scope.go:117] "RemoveContainer" containerID="1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.534602 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917\": container with ID starting with 1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917 not found: ID does not exist" containerID="1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917" Sep 30 03:15:41 crc 
kubenswrapper[4744]: I0930 03:15:41.534624 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917"} err="failed to get container status \"1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917\": rpc error: code = NotFound desc = could not find container \"1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917\": container with ID starting with 1c5cf3aa6d7f13619e1bc94e89bab207ab9ad9316a8a8b87ba9daf18fe182917 not found: ID does not exist" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.534665 4744 scope.go:117] "RemoveContainer" containerID="4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c" Sep 30 03:15:41 crc kubenswrapper[4744]: E0930 03:15:41.534877 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c\": container with ID starting with 4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c not found: ID does not exist" containerID="4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.534941 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c"} err="failed to get container status \"4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c\": rpc error: code = NotFound desc = could not find container \"4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c\": container with ID starting with 4d47e41b6aa31dcf725914a3707a5651e65e2a9a3876e8a4ca4547b62a9b9a7c not found: ID does not exist" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.600953 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-logs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.601026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.601082 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-public-tls-certs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.601105 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.601138 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmw9n\" (UniqueName: \"kubernetes.io/projected/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-kube-api-access-lmw9n\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.601161 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-config-data\") pod \"nova-api-0\" (UID: 
\"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.696399 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.703267 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmw9n\" (UniqueName: \"kubernetes.io/projected/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-kube-api-access-lmw9n\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.703332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-config-data\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.703473 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-logs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.703527 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.703592 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-public-tls-certs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc 
kubenswrapper[4744]: I0930 03:15:41.703630 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.704640 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-logs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.710118 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.712926 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.713133 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-config-data\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.713526 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.716139 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-public-tls-certs\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.726022 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.728403 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.730486 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmw9n\" (UniqueName: \"kubernetes.io/projected/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-kube-api-access-lmw9n\") pod \"nova-api-0\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.730624 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.730680 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.730694 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.747505 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.761359 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.907369 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.907846 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.907866 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxtfs\" (UniqueName: \"kubernetes.io/projected/ea2be999-3323-4e60-b44e-641418e67b04-kube-api-access-sxtfs\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.907888 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-config-data\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.907964 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2be999-3323-4e60-b44e-641418e67b04-run-httpd\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 
03:15:41.907995 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2be999-3323-4e60-b44e-641418e67b04-log-httpd\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.908025 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.908059 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-scripts\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:41 crc kubenswrapper[4744]: I0930 03:15:41.980247 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.002767 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010270 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010339 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010370 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxtfs\" (UniqueName: \"kubernetes.io/projected/ea2be999-3323-4e60-b44e-641418e67b04-kube-api-access-sxtfs\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010428 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-config-data\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010545 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2be999-3323-4e60-b44e-641418e67b04-run-httpd\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010591 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2be999-3323-4e60-b44e-641418e67b04-log-httpd\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.010636 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: 
I0930 03:15:42.010680 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-scripts\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.011944 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2be999-3323-4e60-b44e-641418e67b04-run-httpd\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.012197 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea2be999-3323-4e60-b44e-641418e67b04-log-httpd\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.015824 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.017725 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-config-data\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.018860 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-scripts\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " 
pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.021887 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.022304 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2be999-3323-4e60-b44e-641418e67b04-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.027907 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxtfs\" (UniqueName: \"kubernetes.io/projected/ea2be999-3323-4e60-b44e-641418e67b04-kube-api-access-sxtfs\") pod \"ceilometer-0\" (UID: \"ea2be999-3323-4e60-b44e-641418e67b04\") " pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.190049 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.230280 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:42 crc kubenswrapper[4744]: W0930 03:15:42.242547 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeee68e6_9bb5_4921_818f_ba3a72e29ee1.slice/crio-be16a18b82ae0ea5af53ddc8aec37ccaf64293d623ed9aaf130da338dca8a68c WatchSource:0}: Error finding container be16a18b82ae0ea5af53ddc8aec37ccaf64293d623ed9aaf130da338dca8a68c: Status 404 returned error can't find the container with id be16a18b82ae0ea5af53ddc8aec37ccaf64293d623ed9aaf130da338dca8a68c Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.394260 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eeee68e6-9bb5-4921-818f-ba3a72e29ee1","Type":"ContainerStarted","Data":"be16a18b82ae0ea5af53ddc8aec37ccaf64293d623ed9aaf130da338dca8a68c"} Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.417653 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.545758 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pr7xb"] Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.546849 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.550991 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.551548 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.571184 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pr7xb"] Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.700062 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.737498 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l4jx\" (UniqueName: \"kubernetes.io/projected/b286f88e-e3f3-4730-b831-4db33fb09a99-kube-api-access-9l4jx\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.737574 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.737627 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-config-data\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc 
kubenswrapper[4744]: I0930 03:15:42.737676 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-scripts\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.840093 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-config-data\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.840200 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-scripts\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.840341 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l4jx\" (UniqueName: \"kubernetes.io/projected/b286f88e-e3f3-4730-b831-4db33fb09a99-kube-api-access-9l4jx\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.840411 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.844020 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-scripts\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.844319 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.851056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-config-data\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.857582 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l4jx\" (UniqueName: \"kubernetes.io/projected/b286f88e-e3f3-4730-b831-4db33fb09a99-kube-api-access-9l4jx\") pod \"nova-cell1-cell-mapping-pr7xb\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:42 crc kubenswrapper[4744]: I0930 03:15:42.928840 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.389535 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pr7xb"] Sep 30 03:15:43 crc kubenswrapper[4744]: W0930 03:15:43.395383 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb286f88e_e3f3_4730_b831_4db33fb09a99.slice/crio-df702361277fa448ed77c4cf2ffd15e1a77de22263e1a497ae8e778b9b67dac9 WatchSource:0}: Error finding container df702361277fa448ed77c4cf2ffd15e1a77de22263e1a497ae8e778b9b67dac9: Status 404 returned error can't find the container with id df702361277fa448ed77c4cf2ffd15e1a77de22263e1a497ae8e778b9b67dac9 Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.410907 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eeee68e6-9bb5-4921-818f-ba3a72e29ee1","Type":"ContainerStarted","Data":"83f85c7d9ffca4179a51b1746a27f3cc817454ba176d06c15c2850e1eaa8d33f"} Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.410961 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eeee68e6-9bb5-4921-818f-ba3a72e29ee1","Type":"ContainerStarted","Data":"e07dd89cc8e93e9a7e605ffb206b57b726361eedeb26262a617fb201d897c707"} Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.413447 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2be999-3323-4e60-b44e-641418e67b04","Type":"ContainerStarted","Data":"06a2689c0760b1050a32226ca64d40f69056470513ca3e3a3ce50564f7753fad"} Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.413489 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2be999-3323-4e60-b44e-641418e67b04","Type":"ContainerStarted","Data":"4c9dd6bfc2c8234b7f129a3ffc0aaf909dd486d422a1b0d58e5387c9057333dc"} Sep 30 03:15:43 crc 
kubenswrapper[4744]: I0930 03:15:43.414845 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pr7xb" event={"ID":"b286f88e-e3f3-4730-b831-4db33fb09a99","Type":"ContainerStarted","Data":"df702361277fa448ed77c4cf2ffd15e1a77de22263e1a497ae8e778b9b67dac9"} Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.436226 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.436200791 podStartE2EDuration="2.436200791s" podCreationTimestamp="2025-09-30 03:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:43.426237192 +0000 UTC m=+1270.599457186" watchObservedRunningTime="2025-09-30 03:15:43.436200791 +0000 UTC m=+1270.609420775" Sep 30 03:15:43 crc kubenswrapper[4744]: I0930 03:15:43.534782 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45dcef2c-67a3-462d-8039-848684f60ede" path="/var/lib/kubelet/pods/45dcef2c-67a3-462d-8039-848684f60ede/volumes" Sep 30 03:15:44 crc kubenswrapper[4744]: I0930 03:15:44.425274 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pr7xb" event={"ID":"b286f88e-e3f3-4730-b831-4db33fb09a99","Type":"ContainerStarted","Data":"c092655f713849f89592150967af91bb962be8d9d5379b07efc46fb997db13a3"} Sep 30 03:15:44 crc kubenswrapper[4744]: I0930 03:15:44.427175 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2be999-3323-4e60-b44e-641418e67b04","Type":"ContainerStarted","Data":"f32cd61a90e52e4fcccd6b312eef332740b9d9d009bbed05d4d8c136812f0df4"} Sep 30 03:15:44 crc kubenswrapper[4744]: I0930 03:15:44.441564 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pr7xb" podStartSLOduration=2.441545879 podStartE2EDuration="2.441545879s" podCreationTimestamp="2025-09-30 
03:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:44.440221348 +0000 UTC m=+1271.613441332" watchObservedRunningTime="2025-09-30 03:15:44.441545879 +0000 UTC m=+1271.614765853" Sep 30 03:15:44 crc kubenswrapper[4744]: I0930 03:15:44.864548 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:15:44 crc kubenswrapper[4744]: I0930 03:15:44.947335 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-b86jr"] Sep 30 03:15:44 crc kubenswrapper[4744]: I0930 03:15:44.947577 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerName="dnsmasq-dns" containerID="cri-o://b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d" gracePeriod=10 Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.429007 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.435855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2be999-3323-4e60-b44e-641418e67b04","Type":"ContainerStarted","Data":"f76ced7b9aa614809b659c8aaaf00c651971ffa69ce283eba82828814bf7ba6a"} Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.438336 4744 generic.go:334] "Generic (PLEG): container finished" podID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerID="b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d" exitCode=0 Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.438444 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.438470 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" event={"ID":"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65","Type":"ContainerDied","Data":"b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d"} Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.438491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-b86jr" event={"ID":"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65","Type":"ContainerDied","Data":"3ba3040650919edc2ba71578f448fe8a5d160a216149898e3b646c9e988ee366"} Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.438506 4744 scope.go:117] "RemoveContainer" containerID="b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.460436 4744 scope.go:117] "RemoveContainer" containerID="ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.497752 4744 scope.go:117] "RemoveContainer" containerID="b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d" Sep 30 03:15:45 crc kubenswrapper[4744]: E0930 03:15:45.499020 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d\": container with ID starting with b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d not found: ID does not exist" containerID="b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.499056 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d"} err="failed to get container status 
\"b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d\": rpc error: code = NotFound desc = could not find container \"b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d\": container with ID starting with b3564aa18d524838d44b2a8445f784167b37af27aeb096d1d5006e466386a53d not found: ID does not exist" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.499077 4744 scope.go:117] "RemoveContainer" containerID="ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956" Sep 30 03:15:45 crc kubenswrapper[4744]: E0930 03:15:45.499339 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956\": container with ID starting with ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956 not found: ID does not exist" containerID="ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.499393 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956"} err="failed to get container status \"ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956\": rpc error: code = NotFound desc = could not find container \"ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956\": container with ID starting with ecb40c5c9ac2ace9163e60284482890df625c88f4422c7e23798dc4027872956 not found: ID does not exist" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.594275 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-sb\") pod \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.594425 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-config\") pod \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.594462 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dctv8\" (UniqueName: \"kubernetes.io/projected/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-kube-api-access-dctv8\") pod \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.594483 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-nb\") pod \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.594654 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-swift-storage-0\") pod \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.594684 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-svc\") pod \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\" (UID: \"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65\") " Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.599031 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-kube-api-access-dctv8" (OuterVolumeSpecName: "kube-api-access-dctv8") pod 
"a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" (UID: "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65"). InnerVolumeSpecName "kube-api-access-dctv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.663183 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-config" (OuterVolumeSpecName: "config") pod "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" (UID: "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.665736 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" (UID: "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.668006 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" (UID: "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.688722 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" (UID: "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.697245 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.697286 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.697298 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dctv8\" (UniqueName: \"kubernetes.io/projected/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-kube-api-access-dctv8\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.697313 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.697323 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.704880 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" (UID: "a9a0bae2-8e99-4164-9b56-e7bdaa5cde65"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.767028 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-b86jr"] Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.778719 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-b86jr"] Sep 30 03:15:45 crc kubenswrapper[4744]: I0930 03:15:45.799085 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:46 crc kubenswrapper[4744]: I0930 03:15:46.450912 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea2be999-3323-4e60-b44e-641418e67b04","Type":"ContainerStarted","Data":"aa4159908f9efa069b5344aaab8a048928912b41c352f6cc4bff8ec289125dc2"} Sep 30 03:15:46 crc kubenswrapper[4744]: I0930 03:15:46.451405 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 03:15:46 crc kubenswrapper[4744]: I0930 03:15:46.476956 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.240419235 podStartE2EDuration="5.476939951s" podCreationTimestamp="2025-09-30 03:15:41 +0000 UTC" firstStartedPulling="2025-09-30 03:15:42.725476316 +0000 UTC m=+1269.898696280" lastFinishedPulling="2025-09-30 03:15:45.961997022 +0000 UTC m=+1273.135216996" observedRunningTime="2025-09-30 03:15:46.469922483 +0000 UTC m=+1273.643142457" watchObservedRunningTime="2025-09-30 03:15:46.476939951 +0000 UTC m=+1273.650159925" Sep 30 03:15:47 crc kubenswrapper[4744]: I0930 03:15:47.523672 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" path="/var/lib/kubelet/pods/a9a0bae2-8e99-4164-9b56-e7bdaa5cde65/volumes" Sep 30 03:15:48 
crc kubenswrapper[4744]: I0930 03:15:48.482763 4744 generic.go:334] "Generic (PLEG): container finished" podID="b286f88e-e3f3-4730-b831-4db33fb09a99" containerID="c092655f713849f89592150967af91bb962be8d9d5379b07efc46fb997db13a3" exitCode=0 Sep 30 03:15:48 crc kubenswrapper[4744]: I0930 03:15:48.482810 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pr7xb" event={"ID":"b286f88e-e3f3-4730-b831-4db33fb09a99","Type":"ContainerDied","Data":"c092655f713849f89592150967af91bb962be8d9d5379b07efc46fb997db13a3"} Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.924999 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.988053 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-combined-ca-bundle\") pod \"b286f88e-e3f3-4730-b831-4db33fb09a99\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.988164 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-scripts\") pod \"b286f88e-e3f3-4730-b831-4db33fb09a99\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.988251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-config-data\") pod \"b286f88e-e3f3-4730-b831-4db33fb09a99\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.988410 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l4jx\" (UniqueName: 
\"kubernetes.io/projected/b286f88e-e3f3-4730-b831-4db33fb09a99-kube-api-access-9l4jx\") pod \"b286f88e-e3f3-4730-b831-4db33fb09a99\" (UID: \"b286f88e-e3f3-4730-b831-4db33fb09a99\") " Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.998014 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-scripts" (OuterVolumeSpecName: "scripts") pod "b286f88e-e3f3-4730-b831-4db33fb09a99" (UID: "b286f88e-e3f3-4730-b831-4db33fb09a99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:49 crc kubenswrapper[4744]: I0930 03:15:49.998066 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b286f88e-e3f3-4730-b831-4db33fb09a99-kube-api-access-9l4jx" (OuterVolumeSpecName: "kube-api-access-9l4jx") pod "b286f88e-e3f3-4730-b831-4db33fb09a99" (UID: "b286f88e-e3f3-4730-b831-4db33fb09a99"). InnerVolumeSpecName "kube-api-access-9l4jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.019680 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-config-data" (OuterVolumeSpecName: "config-data") pod "b286f88e-e3f3-4730-b831-4db33fb09a99" (UID: "b286f88e-e3f3-4730-b831-4db33fb09a99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.048774 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b286f88e-e3f3-4730-b831-4db33fb09a99" (UID: "b286f88e-e3f3-4730-b831-4db33fb09a99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.091462 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.091496 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l4jx\" (UniqueName: \"kubernetes.io/projected/b286f88e-e3f3-4730-b831-4db33fb09a99-kube-api-access-9l4jx\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.091506 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.091515 4744 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b286f88e-e3f3-4730-b831-4db33fb09a99-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.514652 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pr7xb" event={"ID":"b286f88e-e3f3-4730-b831-4db33fb09a99","Type":"ContainerDied","Data":"df702361277fa448ed77c4cf2ffd15e1a77de22263e1a497ae8e778b9b67dac9"} Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.514973 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df702361277fa448ed77c4cf2ffd15e1a77de22263e1a497ae8e778b9b67dac9" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.515040 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pr7xb" Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.823762 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.824115 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-log" containerID="cri-o://e07dd89cc8e93e9a7e605ffb206b57b726361eedeb26262a617fb201d897c707" gracePeriod=30 Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.824426 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-api" containerID="cri-o://83f85c7d9ffca4179a51b1746a27f3cc817454ba176d06c15c2850e1eaa8d33f" gracePeriod=30 Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.845794 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.846085 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="db5bd596-1d8b-418b-9239-8ff206093f2d" containerName="nova-scheduler-scheduler" containerID="cri-o://8d0c4b9649d615dbb39d654e2ab22eaacba7914f38569201a69bf2ab1f0eb241" gracePeriod=30 Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.874779 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.875031 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-log" containerID="cri-o://c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c" gracePeriod=30 Sep 30 03:15:50 crc kubenswrapper[4744]: I0930 03:15:50.875174 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-metadata" containerID="cri-o://ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41" gracePeriod=30 Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.524549 4744 generic.go:334] "Generic (PLEG): container finished" podID="db5bd596-1d8b-418b-9239-8ff206093f2d" containerID="8d0c4b9649d615dbb39d654e2ab22eaacba7914f38569201a69bf2ab1f0eb241" exitCode=0 Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.524578 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"db5bd596-1d8b-418b-9239-8ff206093f2d","Type":"ContainerDied","Data":"8d0c4b9649d615dbb39d654e2ab22eaacba7914f38569201a69bf2ab1f0eb241"} Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.527313 4744 generic.go:334] "Generic (PLEG): container finished" podID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerID="c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c" exitCode=143 Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.527422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e505a047-5448-43a5-8d2c-cc5bdb4db4ee","Type":"ContainerDied","Data":"c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c"} Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.529299 4744 generic.go:334] "Generic (PLEG): container finished" podID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerID="83f85c7d9ffca4179a51b1746a27f3cc817454ba176d06c15c2850e1eaa8d33f" exitCode=0 Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.529314 4744 generic.go:334] "Generic (PLEG): container finished" podID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerID="e07dd89cc8e93e9a7e605ffb206b57b726361eedeb26262a617fb201d897c707" exitCode=143 Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.529476 4744 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eeee68e6-9bb5-4921-818f-ba3a72e29ee1","Type":"ContainerDied","Data":"83f85c7d9ffca4179a51b1746a27f3cc817454ba176d06c15c2850e1eaa8d33f"} Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.529570 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eeee68e6-9bb5-4921-818f-ba3a72e29ee1","Type":"ContainerDied","Data":"e07dd89cc8e93e9a7e605ffb206b57b726361eedeb26262a617fb201d897c707"} Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.529642 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eeee68e6-9bb5-4921-818f-ba3a72e29ee1","Type":"ContainerDied","Data":"be16a18b82ae0ea5af53ddc8aec37ccaf64293d623ed9aaf130da338dca8a68c"} Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.529725 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be16a18b82ae0ea5af53ddc8aec37ccaf64293d623ed9aaf130da338dca8a68c" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.576272 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.724077 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-logs\") pod \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.724142 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-combined-ca-bundle\") pod \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.724209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-public-tls-certs\") pod \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.724253 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-config-data\") pod \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.724306 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-internal-tls-certs\") pod \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.724346 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmw9n\" (UniqueName: 
\"kubernetes.io/projected/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-kube-api-access-lmw9n\") pod \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\" (UID: \"eeee68e6-9bb5-4921-818f-ba3a72e29ee1\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.726494 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-logs" (OuterVolumeSpecName: "logs") pod "eeee68e6-9bb5-4921-818f-ba3a72e29ee1" (UID: "eeee68e6-9bb5-4921-818f-ba3a72e29ee1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.729801 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-kube-api-access-lmw9n" (OuterVolumeSpecName: "kube-api-access-lmw9n") pod "eeee68e6-9bb5-4921-818f-ba3a72e29ee1" (UID: "eeee68e6-9bb5-4921-818f-ba3a72e29ee1"). InnerVolumeSpecName "kube-api-access-lmw9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.757873 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.760463 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eeee68e6-9bb5-4921-818f-ba3a72e29ee1" (UID: "eeee68e6-9bb5-4921-818f-ba3a72e29ee1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.766826 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-config-data" (OuterVolumeSpecName: "config-data") pod "eeee68e6-9bb5-4921-818f-ba3a72e29ee1" (UID: "eeee68e6-9bb5-4921-818f-ba3a72e29ee1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.799327 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eeee68e6-9bb5-4921-818f-ba3a72e29ee1" (UID: "eeee68e6-9bb5-4921-818f-ba3a72e29ee1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.800975 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eeee68e6-9bb5-4921-818f-ba3a72e29ee1" (UID: "eeee68e6-9bb5-4921-818f-ba3a72e29ee1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.828392 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-config-data\") pod \"db5bd596-1d8b-418b-9239-8ff206093f2d\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.828461 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-combined-ca-bundle\") pod \"db5bd596-1d8b-418b-9239-8ff206093f2d\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.829611 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk5cn\" (UniqueName: \"kubernetes.io/projected/db5bd596-1d8b-418b-9239-8ff206093f2d-kube-api-access-xk5cn\") pod \"db5bd596-1d8b-418b-9239-8ff206093f2d\" (UID: \"db5bd596-1d8b-418b-9239-8ff206093f2d\") " Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.830128 4744 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.830145 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmw9n\" (UniqueName: \"kubernetes.io/projected/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-kube-api-access-lmw9n\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.830155 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.830164 4744 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.830174 4744 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.830183 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eeee68e6-9bb5-4921-818f-ba3a72e29ee1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.832728 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5bd596-1d8b-418b-9239-8ff206093f2d-kube-api-access-xk5cn" (OuterVolumeSpecName: "kube-api-access-xk5cn") pod "db5bd596-1d8b-418b-9239-8ff206093f2d" (UID: "db5bd596-1d8b-418b-9239-8ff206093f2d"). InnerVolumeSpecName "kube-api-access-xk5cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.852677 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-config-data" (OuterVolumeSpecName: "config-data") pod "db5bd596-1d8b-418b-9239-8ff206093f2d" (UID: "db5bd596-1d8b-418b-9239-8ff206093f2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.871785 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db5bd596-1d8b-418b-9239-8ff206093f2d" (UID: "db5bd596-1d8b-418b-9239-8ff206093f2d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.932035 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.932274 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5bd596-1d8b-418b-9239-8ff206093f2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:51 crc kubenswrapper[4744]: I0930 03:15:51.932286 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk5cn\" (UniqueName: \"kubernetes.io/projected/db5bd596-1d8b-418b-9239-8ff206093f2d-kube-api-access-xk5cn\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.545544 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.545616 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"db5bd596-1d8b-418b-9239-8ff206093f2d","Type":"ContainerDied","Data":"3cd6928aaed68eaf444aaa9bfc5af4bf0c3a523a2e040da08f7d730bb8e95a4a"} Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.545712 4744 scope.go:117] "RemoveContainer" containerID="8d0c4b9649d615dbb39d654e2ab22eaacba7914f38569201a69bf2ab1f0eb241" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.551502 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.615006 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.632834 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.685189 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.709214 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717000 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: E0930 03:15:52.717558 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b286f88e-e3f3-4730-b831-4db33fb09a99" containerName="nova-manage" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717581 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b286f88e-e3f3-4730-b831-4db33fb09a99" containerName="nova-manage" Sep 30 03:15:52 crc kubenswrapper[4744]: E0930 03:15:52.717607 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-api" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717615 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-api" Sep 30 03:15:52 crc kubenswrapper[4744]: E0930 03:15:52.717634 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerName="init" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717642 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerName="init" Sep 30 03:15:52 crc kubenswrapper[4744]: E0930 
03:15:52.717655 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerName="dnsmasq-dns" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717663 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerName="dnsmasq-dns" Sep 30 03:15:52 crc kubenswrapper[4744]: E0930 03:15:52.717680 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-log" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717688 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-log" Sep 30 03:15:52 crc kubenswrapper[4744]: E0930 03:15:52.717743 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5bd596-1d8b-418b-9239-8ff206093f2d" containerName="nova-scheduler-scheduler" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.717790 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5bd596-1d8b-418b-9239-8ff206093f2d" containerName="nova-scheduler-scheduler" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.718014 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-log" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.718033 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5bd596-1d8b-418b-9239-8ff206093f2d" containerName="nova-scheduler-scheduler" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.718054 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b286f88e-e3f3-4730-b831-4db33fb09a99" containerName="nova-manage" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.718068 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" containerName="nova-api-api" Sep 30 03:15:52 crc kubenswrapper[4744]: 
I0930 03:15:52.718084 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a0bae2-8e99-4164-9b56-e7bdaa5cde65" containerName="dnsmasq-dns" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.720170 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.724040 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.724116 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.724124 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.725307 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.734735 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.741123 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.743307 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.746913 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855469 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-config-data\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855567 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855637 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgss\" (UniqueName: \"kubernetes.io/projected/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-kube-api-access-dwgss\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855685 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-logs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855740 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gp7\" (UniqueName: \"kubernetes.io/projected/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-kube-api-access-45gp7\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855768 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.855968 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-config-data\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.856058 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.856113 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-public-tls-certs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958144 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-config-data\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958199 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958232 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgss\" (UniqueName: \"kubernetes.io/projected/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-kube-api-access-dwgss\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958261 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-logs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958285 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gp7\" (UniqueName: \"kubernetes.io/projected/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-kube-api-access-45gp7\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 
03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958333 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-config-data\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958354 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.958401 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-public-tls-certs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.959211 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-logs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.965805 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-public-tls-certs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.965924 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-config-data\") pod \"nova-api-0\" (UID: 
\"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.966422 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-config-data\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.966516 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.968215 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.980700 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:52 crc kubenswrapper[4744]: I0930 03:15:52.996428 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gp7\" (UniqueName: \"kubernetes.io/projected/5a7ff737-dbb5-4e5c-9862-6b99f8584fc4-kube-api-access-45gp7\") pod \"nova-api-0\" (UID: \"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4\") " pod="openstack/nova-api-0" Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.001095 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dwgss\" (UniqueName: \"kubernetes.io/projected/897dfc3d-b2fa-4a22-b5a9-e2ce2c486801-kube-api-access-dwgss\") pod \"nova-scheduler-0\" (UID: \"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801\") " pod="openstack/nova-scheduler-0" Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.046563 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.063952 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.517508 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5bd596-1d8b-418b-9239-8ff206093f2d" path="/var/lib/kubelet/pods/db5bd596-1d8b-418b-9239-8ff206093f2d/volumes" Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.519093 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeee68e6-9bb5-4921-818f-ba3a72e29ee1" path="/var/lib/kubelet/pods/eeee68e6-9bb5-4921-818f-ba3a72e29ee1/volumes" Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.639238 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 03:15:53 crc kubenswrapper[4744]: W0930 03:15:53.659719 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a7ff737_dbb5_4e5c_9862_6b99f8584fc4.slice/crio-2193cef7f1ce592282aafed45b33ef5e54237e99eb546634ca3dbcec32c517f5 WatchSource:0}: Error finding container 2193cef7f1ce592282aafed45b33ef5e54237e99eb546634ca3dbcec32c517f5: Status 404 returned error can't find the container with id 2193cef7f1ce592282aafed45b33ef5e54237e99eb546634ca3dbcec32c517f5 Sep 30 03:15:53 crc kubenswrapper[4744]: I0930 03:15:53.706425 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 03:15:53 crc kubenswrapper[4744]: W0930 03:15:53.711599 4744 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod897dfc3d_b2fa_4a22_b5a9_e2ce2c486801.slice/crio-b34aed78d77d5442b74b1361698e5c1cdd689865f1e81c79371d909f90aee6dc WatchSource:0}: Error finding container b34aed78d77d5442b74b1361698e5c1cdd689865f1e81c79371d909f90aee6dc: Status 404 returned error can't find the container with id b34aed78d77d5442b74b1361698e5c1cdd689865f1e81c79371d909f90aee6dc Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.025779 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:51354->10.217.0.210:8775: read: connection reset by peer" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.025779 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:51352->10.217.0.210:8775: read: connection reset by peer" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.505948 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.569348 4744 generic.go:334] "Generic (PLEG): container finished" podID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerID="ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41" exitCode=0 Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.569407 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.569427 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e505a047-5448-43a5-8d2c-cc5bdb4db4ee","Type":"ContainerDied","Data":"ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.569699 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e505a047-5448-43a5-8d2c-cc5bdb4db4ee","Type":"ContainerDied","Data":"aa1d7844006ba51fc4e4c2bf449c76ba34c83d24060e87d962b432b945f738d6"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.569739 4744 scope.go:117] "RemoveContainer" containerID="ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.572778 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4","Type":"ContainerStarted","Data":"52501624fd9dc9b991f2ff9bb9137c36a42149e6f20b48b9cc54f32899c3f45d"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.573078 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4","Type":"ContainerStarted","Data":"edb064d87cb8b41ff271bb029f773c4c268e7521196e25af74f554196eb454de"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.573096 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a7ff737-dbb5-4e5c-9862-6b99f8584fc4","Type":"ContainerStarted","Data":"2193cef7f1ce592282aafed45b33ef5e54237e99eb546634ca3dbcec32c517f5"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.576039 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801","Type":"ContainerStarted","Data":"100ad76ab0885d9db8fb379d818b60cc817a7d9a1c2798a539e0f0131daa2c8f"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.576068 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"897dfc3d-b2fa-4a22-b5a9-e2ce2c486801","Type":"ContainerStarted","Data":"b34aed78d77d5442b74b1361698e5c1cdd689865f1e81c79371d909f90aee6dc"} Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.601098 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6010780970000003 podStartE2EDuration="2.601078097s" podCreationTimestamp="2025-09-30 03:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:54.59312162 +0000 UTC m=+1281.766341604" watchObservedRunningTime="2025-09-30 03:15:54.601078097 +0000 UTC m=+1281.774298091" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.602703 4744 scope.go:117] "RemoveContainer" containerID="c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.609083 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l5c2\" (UniqueName: \"kubernetes.io/projected/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-kube-api-access-5l5c2\") pod \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.609150 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-logs\") pod \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.609293 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-combined-ca-bundle\") pod \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.609348 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-nova-metadata-tls-certs\") pod \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.609401 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-config-data\") pod \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\" (UID: \"e505a047-5448-43a5-8d2c-cc5bdb4db4ee\") " Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.615119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-logs" (OuterVolumeSpecName: "logs") pod "e505a047-5448-43a5-8d2c-cc5bdb4db4ee" (UID: "e505a047-5448-43a5-8d2c-cc5bdb4db4ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.616769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-kube-api-access-5l5c2" (OuterVolumeSpecName: "kube-api-access-5l5c2") pod "e505a047-5448-43a5-8d2c-cc5bdb4db4ee" (UID: "e505a047-5448-43a5-8d2c-cc5bdb4db4ee"). InnerVolumeSpecName "kube-api-access-5l5c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.621560 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6215456919999998 podStartE2EDuration="2.621545692s" podCreationTimestamp="2025-09-30 03:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:54.614541805 +0000 UTC m=+1281.787761799" watchObservedRunningTime="2025-09-30 03:15:54.621545692 +0000 UTC m=+1281.794765666" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.644640 4744 scope.go:117] "RemoveContainer" containerID="ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.647997 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e505a047-5448-43a5-8d2c-cc5bdb4db4ee" (UID: "e505a047-5448-43a5-8d2c-cc5bdb4db4ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.649191 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-config-data" (OuterVolumeSpecName: "config-data") pod "e505a047-5448-43a5-8d2c-cc5bdb4db4ee" (UID: "e505a047-5448-43a5-8d2c-cc5bdb4db4ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:54 crc kubenswrapper[4744]: E0930 03:15:54.653492 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41\": container with ID starting with ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41 not found: ID does not exist" containerID="ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.653538 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41"} err="failed to get container status \"ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41\": rpc error: code = NotFound desc = could not find container \"ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41\": container with ID starting with ff916a379142290974c3ae33ddd74580e6d89bb4880e9142cf3730f93c6a5f41 not found: ID does not exist" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.653563 4744 scope.go:117] "RemoveContainer" containerID="c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c" Sep 30 03:15:54 crc kubenswrapper[4744]: E0930 03:15:54.656984 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c\": container with ID starting with c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c not found: ID does not exist" containerID="c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.657128 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c"} err="failed 
to get container status \"c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c\": rpc error: code = NotFound desc = could not find container \"c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c\": container with ID starting with c7cd2807458126d907358deda1359b12ad4c3db33549534dbcf442594420b84c not found: ID does not exist" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.671717 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e505a047-5448-43a5-8d2c-cc5bdb4db4ee" (UID: "e505a047-5448-43a5-8d2c-cc5bdb4db4ee"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.713108 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l5c2\" (UniqueName: \"kubernetes.io/projected/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-kube-api-access-5l5c2\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.713158 4744 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-logs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.713175 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.713194 4744 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.713208 4744 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e505a047-5448-43a5-8d2c-cc5bdb4db4ee-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.910962 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.926116 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.948799 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:54 crc kubenswrapper[4744]: E0930 03:15:54.949454 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-log" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.949481 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-log" Sep 30 03:15:54 crc kubenswrapper[4744]: E0930 03:15:54.949525 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-metadata" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.949538 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-metadata" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.949916 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-log" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.949975 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" containerName="nova-metadata-metadata" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.951710 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.953587 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.954712 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 03:15:54 crc kubenswrapper[4744]: I0930 03:15:54.979496 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.124848 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-config-data\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.125223 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4c2\" (UniqueName: \"kubernetes.io/projected/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-kube-api-access-8l4c2\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.125362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.125602 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.125842 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-logs\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.227154 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-config-data\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.227214 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4c2\" (UniqueName: \"kubernetes.io/projected/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-kube-api-access-8l4c2\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.227249 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.227303 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " 
pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.227385 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-logs\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.227930 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-logs\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.231978 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.235796 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.235830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-config-data\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.253324 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4c2\" (UniqueName: 
\"kubernetes.io/projected/d9aebe30-d132-461b-ad9b-fa6bc9f1227b-kube-api-access-8l4c2\") pod \"nova-metadata-0\" (UID: \"d9aebe30-d132-461b-ad9b-fa6bc9f1227b\") " pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.314098 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.517350 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e505a047-5448-43a5-8d2c-cc5bdb4db4ee" path="/var/lib/kubelet/pods/e505a047-5448-43a5-8d2c-cc5bdb4db4ee/volumes" Sep 30 03:15:55 crc kubenswrapper[4744]: W0930 03:15:55.766067 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9aebe30_d132_461b_ad9b_fa6bc9f1227b.slice/crio-5a2ae71159a11e5447bea6279731a0a8e2ade75220dd37a2b3cf362b02609a01 WatchSource:0}: Error finding container 5a2ae71159a11e5447bea6279731a0a8e2ade75220dd37a2b3cf362b02609a01: Status 404 returned error can't find the container with id 5a2ae71159a11e5447bea6279731a0a8e2ade75220dd37a2b3cf362b02609a01 Sep 30 03:15:55 crc kubenswrapper[4744]: I0930 03:15:55.770273 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 03:15:56 crc kubenswrapper[4744]: I0930 03:15:56.601294 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9aebe30-d132-461b-ad9b-fa6bc9f1227b","Type":"ContainerStarted","Data":"a270429ec6578ead38677767c451c56c64a3d682114c41102dce42391b3a3cd5"} Sep 30 03:15:56 crc kubenswrapper[4744]: I0930 03:15:56.601790 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9aebe30-d132-461b-ad9b-fa6bc9f1227b","Type":"ContainerStarted","Data":"b5498f69bd8991a482a17e8d6ca696bb3bb133c58d9b3c919fabfb79a6dc1a6b"} Sep 30 03:15:56 crc kubenswrapper[4744]: I0930 03:15:56.601827 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9aebe30-d132-461b-ad9b-fa6bc9f1227b","Type":"ContainerStarted","Data":"5a2ae71159a11e5447bea6279731a0a8e2ade75220dd37a2b3cf362b02609a01"} Sep 30 03:15:56 crc kubenswrapper[4744]: I0930 03:15:56.629541 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.629515123 podStartE2EDuration="2.629515123s" podCreationTimestamp="2025-09-30 03:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:15:56.623858218 +0000 UTC m=+1283.797078202" watchObservedRunningTime="2025-09-30 03:15:56.629515123 +0000 UTC m=+1283.802735107" Sep 30 03:15:58 crc kubenswrapper[4744]: I0930 03:15:58.064784 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 03:16:00 crc kubenswrapper[4744]: I0930 03:16:00.314778 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:16:00 crc kubenswrapper[4744]: I0930 03:16:00.315251 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 03:16:03 crc kubenswrapper[4744]: I0930 03:16:03.047004 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 03:16:03 crc kubenswrapper[4744]: I0930 03:16:03.047420 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 03:16:03 crc kubenswrapper[4744]: I0930 03:16:03.064268 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 03:16:03 crc kubenswrapper[4744]: I0930 03:16:03.116253 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 03:16:03 crc kubenswrapper[4744]: I0930 
03:16:03.733478 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 03:16:04 crc kubenswrapper[4744]: I0930 03:16:04.063790 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a7ff737-dbb5-4e5c-9862-6b99f8584fc4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 03:16:04 crc kubenswrapper[4744]: I0930 03:16:04.064318 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a7ff737-dbb5-4e5c-9862-6b99f8584fc4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 03:16:04 crc kubenswrapper[4744]: I0930 03:16:04.347291 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:16:04 crc kubenswrapper[4744]: I0930 03:16:04.347341 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:16:05 crc kubenswrapper[4744]: I0930 03:16:05.315406 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 03:16:05 crc kubenswrapper[4744]: I0930 03:16:05.315655 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 03:16:06 crc kubenswrapper[4744]: I0930 
03:16:06.333704 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9aebe30-d132-461b-ad9b-fa6bc9f1227b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 03:16:06 crc kubenswrapper[4744]: I0930 03:16:06.333742 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9aebe30-d132-461b-ad9b-fa6bc9f1227b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 03:16:12 crc kubenswrapper[4744]: I0930 03:16:12.206137 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 03:16:13 crc kubenswrapper[4744]: I0930 03:16:13.057203 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 03:16:13 crc kubenswrapper[4744]: I0930 03:16:13.058130 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 03:16:13 crc kubenswrapper[4744]: I0930 03:16:13.058184 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 03:16:13 crc kubenswrapper[4744]: I0930 03:16:13.067934 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 03:16:13 crc kubenswrapper[4744]: I0930 03:16:13.815516 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 03:16:13 crc kubenswrapper[4744]: I0930 03:16:13.830646 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 03:16:15 crc kubenswrapper[4744]: I0930 03:16:15.324585 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Sep 30 03:16:15 crc kubenswrapper[4744]: I0930 03:16:15.335453 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 03:16:15 crc kubenswrapper[4744]: I0930 03:16:15.335550 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 03:16:15 crc kubenswrapper[4744]: I0930 03:16:15.843164 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 03:16:23 crc kubenswrapper[4744]: I0930 03:16:23.497104 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:16:24 crc kubenswrapper[4744]: I0930 03:16:24.363539 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:16:28 crc kubenswrapper[4744]: I0930 03:16:28.334358 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="rabbitmq" containerID="cri-o://247926f8f3cadac7f088a36ccd7fa8f1c667eaac56ae4420f2a8ec7c7c5b1c5b" gracePeriod=604796 Sep 30 03:16:28 crc kubenswrapper[4744]: I0930 03:16:28.967064 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="rabbitmq" containerID="cri-o://e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad" gracePeriod=604796 Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.092148 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.348360 4744 patch_prober.go:28] interesting 
pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.348448 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.348502 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.349415 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7975d758249d48351e6b790ced251dcf0b3dce30fe61d2854cf2d73cc541951"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.349508 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://b7975d758249d48351e6b790ced251dcf0b3dce30fe61d2854cf2d73cc541951" gracePeriod=600 Sep 30 03:16:34 crc kubenswrapper[4744]: I0930 03:16:34.400821 4744 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Sep 30 
03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.052022 4744 generic.go:334] "Generic (PLEG): container finished" podID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerID="247926f8f3cadac7f088a36ccd7fa8f1c667eaac56ae4420f2a8ec7c7c5b1c5b" exitCode=0 Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.052151 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79aeb9a3-f29e-49f0-af59-ae29868cc21e","Type":"ContainerDied","Data":"247926f8f3cadac7f088a36ccd7fa8f1c667eaac56ae4420f2a8ec7c7c5b1c5b"} Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.052574 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"79aeb9a3-f29e-49f0-af59-ae29868cc21e","Type":"ContainerDied","Data":"e092da75813a67de8bf2b242c3ec544d741c373cff63dcb25bf56cf34b2b81e5"} Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.052589 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e092da75813a67de8bf2b242c3ec544d741c373cff63dcb25bf56cf34b2b81e5" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.057967 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="b7975d758249d48351e6b790ced251dcf0b3dce30fe61d2854cf2d73cc541951" exitCode=0 Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.058012 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"b7975d758249d48351e6b790ced251dcf0b3dce30fe61d2854cf2d73cc541951"} Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.058083 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f"} 
Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.058105 4744 scope.go:117] "RemoveContainer" containerID="1d72e9221a902ba71a0038b939d0d12d57f148cf38a3a98c9981e273e6748a54" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.066482 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202302 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-server-conf\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202386 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-config-data\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202456 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-plugins-conf\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202500 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202529 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-tls\") pod 
\"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202605 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-plugins\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202689 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79aeb9a3-f29e-49f0-af59-ae29868cc21e-pod-info\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202714 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-erlang-cookie\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202755 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-confd\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202778 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4vmc\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-kube-api-access-j4vmc\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.202797 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79aeb9a3-f29e-49f0-af59-ae29868cc21e-erlang-cookie-secret\") pod \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\" (UID: \"79aeb9a3-f29e-49f0-af59-ae29868cc21e\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.204255 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.204772 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.205229 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.211117 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-kube-api-access-j4vmc" (OuterVolumeSpecName: "kube-api-access-j4vmc") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). 
InnerVolumeSpecName "kube-api-access-j4vmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.213781 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.214476 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/79aeb9a3-f29e-49f0-af59-ae29868cc21e-pod-info" (OuterVolumeSpecName: "pod-info") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.217851 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.222620 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79aeb9a3-f29e-49f0-af59-ae29868cc21e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.259070 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-config-data" (OuterVolumeSpecName: "config-data") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.283627 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-server-conf" (OuterVolumeSpecName: "server-conf") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314128 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314166 4744 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79aeb9a3-f29e-49f0-af59-ae29868cc21e-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314179 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314194 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4vmc\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-kube-api-access-j4vmc\") on node \"crc\" 
DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314206 4744 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79aeb9a3-f29e-49f0-af59-ae29868cc21e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314219 4744 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314230 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314241 4744 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79aeb9a3-f29e-49f0-af59-ae29868cc21e-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314274 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.314287 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.340049 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.373653 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "79aeb9a3-f29e-49f0-af59-ae29868cc21e" (UID: "79aeb9a3-f29e-49f0-af59-ae29868cc21e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.415893 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.415921 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79aeb9a3-f29e-49f0-af59-ae29868cc21e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.629606 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720328 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-server-conf\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720591 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-tls\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720689 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2d0096-8154-4723-aa53-80eaeb9e4d32-pod-info\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: 
\"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720723 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720784 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-config-data\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720810 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-erlang-cookie\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720846 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-plugins\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720886 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-plugins-conf\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720934 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2d2d0096-8154-4723-aa53-80eaeb9e4d32-erlang-cookie-secret\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720958 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk2cp\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-kube-api-access-bk2cp\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.720988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-confd\") pod \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\" (UID: \"2d2d0096-8154-4723-aa53-80eaeb9e4d32\") " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.721713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.721872 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.721994 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.727556 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2d0096-8154-4723-aa53-80eaeb9e4d32-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.734617 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.736530 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2d2d0096-8154-4723-aa53-80eaeb9e4d32-pod-info" (OuterVolumeSpecName: "pod-info") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.736481 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-kube-api-access-bk2cp" (OuterVolumeSpecName: "kube-api-access-bk2cp") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "kube-api-access-bk2cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.742584 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.775800 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-config-data" (OuterVolumeSpecName: "config-data") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.786346 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-server-conf" (OuterVolumeSpecName: "server-conf") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826108 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826140 4744 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d2d0096-8154-4723-aa53-80eaeb9e4d32-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826175 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826193 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826220 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826232 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826244 4744 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826256 4744 
reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d2d0096-8154-4723-aa53-80eaeb9e4d32-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826266 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk2cp\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-kube-api-access-bk2cp\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.826277 4744 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d2d0096-8154-4723-aa53-80eaeb9e4d32-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.838348 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2d2d0096-8154-4723-aa53-80eaeb9e4d32" (UID: "2d2d0096-8154-4723-aa53-80eaeb9e4d32"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.853451 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.929155 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:35 crc kubenswrapper[4744]: I0930 03:16:35.929190 4744 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d2d0096-8154-4723-aa53-80eaeb9e4d32-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.067333 4744 generic.go:334] "Generic (PLEG): container finished" podID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerID="e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad" exitCode=0 Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.067415 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d2d0096-8154-4723-aa53-80eaeb9e4d32","Type":"ContainerDied","Data":"e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad"} Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.067446 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2d2d0096-8154-4723-aa53-80eaeb9e4d32","Type":"ContainerDied","Data":"2cc09e64b0a3ccbdde1191ac21f5d7f6462260e6dad9c19bcb2bdac313ff7b62"} Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.067465 4744 scope.go:117] "RemoveContainer" containerID="e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.067553 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.081507 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.129573 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.149962 4744 scope.go:117] "RemoveContainer" containerID="f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.178759 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.192459 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: E0930 03:16:36.193293 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="setup-container" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.193413 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="setup-container" Sep 30 03:16:36 crc kubenswrapper[4744]: E0930 03:16:36.193510 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="rabbitmq" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.193621 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="rabbitmq" Sep 30 03:16:36 crc kubenswrapper[4744]: E0930 03:16:36.193728 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="rabbitmq" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.193819 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="rabbitmq" Sep 30 03:16:36 crc kubenswrapper[4744]: E0930 03:16:36.193938 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="setup-container" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.194030 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="setup-container" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.194407 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" containerName="rabbitmq" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.194521 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" containerName="rabbitmq" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.196202 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.200697 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.200907 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.201074 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.202348 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.202559 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.202768 4744 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.202958 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-z6w4x" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.203927 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.212447 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.220270 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.223930 4744 scope.go:117] "RemoveContainer" containerID="e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad" Sep 30 03:16:36 crc kubenswrapper[4744]: E0930 03:16:36.224314 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad\": container with ID starting with e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad not found: ID does not exist" containerID="e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.224359 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad"} err="failed to get container status \"e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad\": rpc error: code = NotFound desc = could not find container \"e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad\": container with ID starting with e543ba11a1bd1ec59744682566026951dfb96da350fea36b810e4924699137ad not found: ID does not exist" Sep 30 03:16:36 crc 
kubenswrapper[4744]: I0930 03:16:36.224400 4744 scope.go:117] "RemoveContainer" containerID="f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430" Sep 30 03:16:36 crc kubenswrapper[4744]: E0930 03:16:36.224649 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430\": container with ID starting with f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430 not found: ID does not exist" containerID="f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.224682 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430"} err="failed to get container status \"f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430\": rpc error: code = NotFound desc = could not find container \"f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430\": container with ID starting with f0e4e1160ae49b5e70c9234570be44e44a80217bc408a15383594ebd0372d430 not found: ID does not exist" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.235466 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.237341 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.240581 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.240734 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.240861 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9m667" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.241010 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.241109 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.241258 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.241362 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.251670 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.356972 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86g8\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-kube-api-access-c86g8\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357034 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357071 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357102 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357120 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357141 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357182 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357200 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357214 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357237 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/341a2cff-5aae-4952-a8d8-64d5e247d7f9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357273 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357291 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357313 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-config-data\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357344 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357362 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357407 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357431 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357451 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357484 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmz9\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-kube-api-access-2mmz9\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357504 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357523 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/341a2cff-5aae-4952-a8d8-64d5e247d7f9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.357565 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459360 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459693 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459718 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459742 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmz9\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-kube-api-access-2mmz9\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459759 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459781 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/341a2cff-5aae-4952-a8d8-64d5e247d7f9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459813 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459865 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c86g8\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-kube-api-access-c86g8\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459889 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 
crc kubenswrapper[4744]: I0930 03:16:36.459956 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.459982 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460007 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460039 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460058 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460078 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460108 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/341a2cff-5aae-4952-a8d8-64d5e247d7f9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460129 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460147 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460165 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-config-data\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460182 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.460198 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.461510 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.461785 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.461997 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.462461 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc 
kubenswrapper[4744]: I0930 03:16:36.465558 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.465772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.466665 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.466678 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.466812 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-config-data\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.467128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.467138 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.467746 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/341a2cff-5aae-4952-a8d8-64d5e247d7f9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.468407 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/341a2cff-5aae-4952-a8d8-64d5e247d7f9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.471522 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.472469 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.473009 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/341a2cff-5aae-4952-a8d8-64d5e247d7f9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.473415 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.479307 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.481284 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.489826 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.493675 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2mmz9\" (UniqueName: \"kubernetes.io/projected/7d180fc4-3fb0-4db5-99d7-913559d8ec2e-kube-api-access-2mmz9\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.504804 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86g8\" (UniqueName: \"kubernetes.io/projected/341a2cff-5aae-4952-a8d8-64d5e247d7f9-kube-api-access-c86g8\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.504928 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"341a2cff-5aae-4952-a8d8-64d5e247d7f9\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.511757 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"7d180fc4-3fb0-4db5-99d7-913559d8ec2e\") " pod="openstack/rabbitmq-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.532621 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:16:36 crc kubenswrapper[4744]: I0930 03:16:36.568428 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.032165 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.096760 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"341a2cff-5aae-4952-a8d8-64d5e247d7f9","Type":"ContainerStarted","Data":"c98d43ed9197de08317815d5f063bc325e538986ca754ff1166c57536d186261"} Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.173548 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.321332 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759799d765-8qbg5"] Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.323427 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.325427 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.339718 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759799d765-8qbg5"] Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.487512 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-sb\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.487836 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2l2x\" (UniqueName: 
\"kubernetes.io/projected/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-kube-api-access-p2l2x\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.487875 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-svc\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.487912 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-config\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.487940 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-nb\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.487983 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-swift-storage-0\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.488026 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-openstack-edpm-ipam\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.514648 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2d0096-8154-4723-aa53-80eaeb9e4d32" path="/var/lib/kubelet/pods/2d2d0096-8154-4723-aa53-80eaeb9e4d32/volumes" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.515882 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79aeb9a3-f29e-49f0-af59-ae29868cc21e" path="/var/lib/kubelet/pods/79aeb9a3-f29e-49f0-af59-ae29868cc21e/volumes" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-svc\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589407 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-config\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589440 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-nb\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589481 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-swift-storage-0\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589526 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-openstack-edpm-ipam\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589555 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-sb\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.589608 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2l2x\" (UniqueName: \"kubernetes.io/projected/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-kube-api-access-p2l2x\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.590227 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-svc\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.590437 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-config\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.590930 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-nb\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.591039 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-openstack-edpm-ipam\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.591056 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-sb\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.591633 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-swift-storage-0\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.613451 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2l2x\" (UniqueName: 
\"kubernetes.io/projected/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-kube-api-access-p2l2x\") pod \"dnsmasq-dns-759799d765-8qbg5\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:37 crc kubenswrapper[4744]: I0930 03:16:37.655562 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:38 crc kubenswrapper[4744]: I0930 03:16:38.110759 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"341a2cff-5aae-4952-a8d8-64d5e247d7f9","Type":"ContainerStarted","Data":"6f982afaada728532abab523248fce704daba34b59b8e54f85f542b475b6ecf2"} Sep 30 03:16:38 crc kubenswrapper[4744]: I0930 03:16:38.112954 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7d180fc4-3fb0-4db5-99d7-913559d8ec2e","Type":"ContainerStarted","Data":"6e01dced2ff7ee2e3f41809bbf3612ee8a80c7725fa61665fd439d042c78ec2e"} Sep 30 03:16:38 crc kubenswrapper[4744]: I0930 03:16:38.112991 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7d180fc4-3fb0-4db5-99d7-913559d8ec2e","Type":"ContainerStarted","Data":"89669e5c8377f54f4a33250f857253126e80af85ba5c32d35771def6f28247f0"} Sep 30 03:16:38 crc kubenswrapper[4744]: I0930 03:16:38.117920 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759799d765-8qbg5"] Sep 30 03:16:38 crc kubenswrapper[4744]: W0930 03:16:38.123875 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e077c18_78cc_4f7d_b7d0_8ed5adc31a29.slice/crio-d210a8bb143d18a6553e035ffb0ccb3eecab77ec8bb38306c0f3781fc412af08 WatchSource:0}: Error finding container d210a8bb143d18a6553e035ffb0ccb3eecab77ec8bb38306c0f3781fc412af08: Status 404 returned error can't find the container with id 
d210a8bb143d18a6553e035ffb0ccb3eecab77ec8bb38306c0f3781fc412af08 Sep 30 03:16:39 crc kubenswrapper[4744]: I0930 03:16:39.136727 4744 generic.go:334] "Generic (PLEG): container finished" podID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerID="6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5" exitCode=0 Sep 30 03:16:39 crc kubenswrapper[4744]: I0930 03:16:39.138348 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-8qbg5" event={"ID":"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29","Type":"ContainerDied","Data":"6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5"} Sep 30 03:16:39 crc kubenswrapper[4744]: I0930 03:16:39.138453 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-8qbg5" event={"ID":"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29","Type":"ContainerStarted","Data":"d210a8bb143d18a6553e035ffb0ccb3eecab77ec8bb38306c0f3781fc412af08"} Sep 30 03:16:40 crc kubenswrapper[4744]: I0930 03:16:40.155009 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-8qbg5" event={"ID":"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29","Type":"ContainerStarted","Data":"d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49"} Sep 30 03:16:40 crc kubenswrapper[4744]: I0930 03:16:40.155833 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:40 crc kubenswrapper[4744]: I0930 03:16:40.192539 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759799d765-8qbg5" podStartSLOduration=3.192514098 podStartE2EDuration="3.192514098s" podCreationTimestamp="2025-09-30 03:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:16:40.180868427 +0000 UTC m=+1327.354088391" watchObservedRunningTime="2025-09-30 03:16:40.192514098 +0000 UTC 
m=+1327.365734112" Sep 30 03:16:47 crc kubenswrapper[4744]: I0930 03:16:47.656581 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:47 crc kubenswrapper[4744]: I0930 03:16:47.727630 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-s5klg"] Sep 30 03:16:47 crc kubenswrapper[4744]: I0930 03:16:47.727881 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerName="dnsmasq-dns" containerID="cri-o://8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a" gracePeriod=10 Sep 30 03:16:47 crc kubenswrapper[4744]: I0930 03:16:47.921712 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bb847fbb7-2fpz5"] Sep 30 03:16:47 crc kubenswrapper[4744]: I0930 03:16:47.923580 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:47 crc kubenswrapper[4744]: I0930 03:16:47.931947 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb847fbb7-2fpz5"] Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032655 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qccm9\" (UniqueName: \"kubernetes.io/projected/166b326f-c29c-48e9-b017-034c02b4d448-kube-api-access-qccm9\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032763 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: 
\"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032799 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-dns-svc\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032816 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-config\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032918 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032950 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.032978 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.134117 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.134178 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.134210 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qccm9\" (UniqueName: \"kubernetes.io/projected/166b326f-c29c-48e9-b017-034c02b4d448-kube-api-access-qccm9\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.135456 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.135786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: 
\"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.135924 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-dns-svc\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.135966 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-config\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.136115 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.136149 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.136628 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-dns-svc\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 
30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.137040 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-config\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.137946 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.137990 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/166b326f-c29c-48e9-b017-034c02b4d448-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.167541 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qccm9\" (UniqueName: \"kubernetes.io/projected/166b326f-c29c-48e9-b017-034c02b4d448-kube-api-access-qccm9\") pod \"dnsmasq-dns-5bb847fbb7-2fpz5\" (UID: \"166b326f-c29c-48e9-b017-034c02b4d448\") " pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.241344 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.252537 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.257181 4744 generic.go:334] "Generic (PLEG): container finished" podID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerID="8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a" exitCode=0 Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.257224 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" event={"ID":"a54a9e4e-1abf-4e10-a7cb-98c582b531fa","Type":"ContainerDied","Data":"8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a"} Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.257253 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" event={"ID":"a54a9e4e-1abf-4e10-a7cb-98c582b531fa","Type":"ContainerDied","Data":"d769b7b8b1ece59bf8c56d9cef226bd2d6faf4acb0af5014770ad222b7129a46"} Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.257278 4744 scope.go:117] "RemoveContainer" containerID="8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.257451 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-s5klg" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.345270 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-swift-storage-0\") pod \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.345422 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-nb\") pod \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.345478 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-svc\") pod \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.345546 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncmsf\" (UniqueName: \"kubernetes.io/projected/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-kube-api-access-ncmsf\") pod \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.345584 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-config\") pod \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.345660 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-sb\") pod \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\" (UID: \"a54a9e4e-1abf-4e10-a7cb-98c582b531fa\") " Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.362640 4744 scope.go:117] "RemoveContainer" containerID="98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.363157 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-kube-api-access-ncmsf" (OuterVolumeSpecName: "kube-api-access-ncmsf") pod "a54a9e4e-1abf-4e10-a7cb-98c582b531fa" (UID: "a54a9e4e-1abf-4e10-a7cb-98c582b531fa"). InnerVolumeSpecName "kube-api-access-ncmsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.398558 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a54a9e4e-1abf-4e10-a7cb-98c582b531fa" (UID: "a54a9e4e-1abf-4e10-a7cb-98c582b531fa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.409188 4744 scope.go:117] "RemoveContainer" containerID="8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a" Sep 30 03:16:48 crc kubenswrapper[4744]: E0930 03:16:48.410317 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a\": container with ID starting with 8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a not found: ID does not exist" containerID="8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.410340 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a"} err="failed to get container status \"8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a\": rpc error: code = NotFound desc = could not find container \"8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a\": container with ID starting with 8f04339f89cae14139d264a877e768cc67b12bb72026e2ff87ae47d1da66f74a not found: ID does not exist" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.410393 4744 scope.go:117] "RemoveContainer" containerID="98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347" Sep 30 03:16:48 crc kubenswrapper[4744]: E0930 03:16:48.410644 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347\": container with ID starting with 98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347 not found: ID does not exist" containerID="98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.410672 
4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347"} err="failed to get container status \"98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347\": rpc error: code = NotFound desc = could not find container \"98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347\": container with ID starting with 98d7282a8d817ccae006e02ceb67795b580b2a3ba3bff2cab303222205368347 not found: ID does not exist" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.427991 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-config" (OuterVolumeSpecName: "config") pod "a54a9e4e-1abf-4e10-a7cb-98c582b531fa" (UID: "a54a9e4e-1abf-4e10-a7cb-98c582b531fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.428447 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a54a9e4e-1abf-4e10-a7cb-98c582b531fa" (UID: "a54a9e4e-1abf-4e10-a7cb-98c582b531fa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.428609 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a54a9e4e-1abf-4e10-a7cb-98c582b531fa" (UID: "a54a9e4e-1abf-4e10-a7cb-98c582b531fa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.429197 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a54a9e4e-1abf-4e10-a7cb-98c582b531fa" (UID: "a54a9e4e-1abf-4e10-a7cb-98c582b531fa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.447649 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.447699 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.447716 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncmsf\" (UniqueName: \"kubernetes.io/projected/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-kube-api-access-ncmsf\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.447732 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.447744 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.447755 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a54a9e4e-1abf-4e10-a7cb-98c582b531fa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.598078 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-s5klg"] Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.606906 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-s5klg"] Sep 30 03:16:48 crc kubenswrapper[4744]: I0930 03:16:48.691556 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb847fbb7-2fpz5"] Sep 30 03:16:48 crc kubenswrapper[4744]: W0930 03:16:48.695208 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod166b326f_c29c_48e9_b017_034c02b4d448.slice/crio-1e558df1de6ddabea8ae0ec5f65303093aeeb5fa20006ad31092eb2a64ab2388 WatchSource:0}: Error finding container 1e558df1de6ddabea8ae0ec5f65303093aeeb5fa20006ad31092eb2a64ab2388: Status 404 returned error can't find the container with id 1e558df1de6ddabea8ae0ec5f65303093aeeb5fa20006ad31092eb2a64ab2388 Sep 30 03:16:49 crc kubenswrapper[4744]: I0930 03:16:49.282934 4744 generic.go:334] "Generic (PLEG): container finished" podID="166b326f-c29c-48e9-b017-034c02b4d448" containerID="bd0b95c6b7758e9ab97437803472b95b2f34464f351314f88961d29d5d7c1882" exitCode=0 Sep 30 03:16:49 crc kubenswrapper[4744]: I0930 03:16:49.283028 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" event={"ID":"166b326f-c29c-48e9-b017-034c02b4d448","Type":"ContainerDied","Data":"bd0b95c6b7758e9ab97437803472b95b2f34464f351314f88961d29d5d7c1882"} Sep 30 03:16:49 crc kubenswrapper[4744]: I0930 03:16:49.283315 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" 
event={"ID":"166b326f-c29c-48e9-b017-034c02b4d448","Type":"ContainerStarted","Data":"1e558df1de6ddabea8ae0ec5f65303093aeeb5fa20006ad31092eb2a64ab2388"} Sep 30 03:16:49 crc kubenswrapper[4744]: I0930 03:16:49.515989 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" path="/var/lib/kubelet/pods/a54a9e4e-1abf-4e10-a7cb-98c582b531fa/volumes" Sep 30 03:16:50 crc kubenswrapper[4744]: I0930 03:16:50.334809 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" event={"ID":"166b326f-c29c-48e9-b017-034c02b4d448","Type":"ContainerStarted","Data":"42b0d259242bf66dd3d7e438111b4a5244d8085019f8496196e2352974c859bd"} Sep 30 03:16:50 crc kubenswrapper[4744]: I0930 03:16:50.335070 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:50 crc kubenswrapper[4744]: I0930 03:16:50.365728 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" podStartSLOduration=3.36570422 podStartE2EDuration="3.36570422s" podCreationTimestamp="2025-09-30 03:16:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:16:50.363053998 +0000 UTC m=+1337.536273982" watchObservedRunningTime="2025-09-30 03:16:50.36570422 +0000 UTC m=+1337.538924224" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.242645 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bb847fbb7-2fpz5" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.358234 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759799d765-8qbg5"] Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.358552 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759799d765-8qbg5" 
podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerName="dnsmasq-dns" containerID="cri-o://d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49" gracePeriod=10 Sep 30 03:16:58 crc kubenswrapper[4744]: E0930 03:16:58.400542 4744 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e077c18_78cc_4f7d_b7d0_8ed5adc31a29.slice/crio-d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49.scope\": RecentStats: unable to find data in memory cache]" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.829949 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.897709 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-config\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.897806 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-sb\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.897952 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2l2x\" (UniqueName: \"kubernetes.io/projected/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-kube-api-access-p2l2x\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.897993 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-nb\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.898032 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-svc\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.898106 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-swift-storage-0\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.898189 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-openstack-edpm-ipam\") pod \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\" (UID: \"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29\") " Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.903589 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-kube-api-access-p2l2x" (OuterVolumeSpecName: "kube-api-access-p2l2x") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "kube-api-access-p2l2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.957812 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.960021 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.965160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.976059 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.976220 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:58 crc kubenswrapper[4744]: I0930 03:16:58.984829 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-config" (OuterVolumeSpecName: "config") pod "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" (UID: "8e077c18-78cc-4f7d-b7d0-8ed5adc31a29"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001789 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2l2x\" (UniqueName: \"kubernetes.io/projected/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-kube-api-access-p2l2x\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001819 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001828 4744 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001838 4744 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001847 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001854 4744 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-config\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.001862 4744 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.444330 4744 generic.go:334] "Generic (PLEG): container finished" podID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerID="d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49" exitCode=0 Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.444383 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-8qbg5" event={"ID":"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29","Type":"ContainerDied","Data":"d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49"} Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.444408 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-8qbg5" event={"ID":"8e077c18-78cc-4f7d-b7d0-8ed5adc31a29","Type":"ContainerDied","Data":"d210a8bb143d18a6553e035ffb0ccb3eecab77ec8bb38306c0f3781fc412af08"} Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.444413 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-8qbg5" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.444424 4744 scope.go:117] "RemoveContainer" containerID="d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.473325 4744 scope.go:117] "RemoveContainer" containerID="6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.478584 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759799d765-8qbg5"] Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.486446 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759799d765-8qbg5"] Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.517715 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" path="/var/lib/kubelet/pods/8e077c18-78cc-4f7d-b7d0-8ed5adc31a29/volumes" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.533467 4744 scope.go:117] "RemoveContainer" containerID="d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49" Sep 30 03:16:59 crc kubenswrapper[4744]: E0930 03:16:59.533959 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49\": container with ID starting with d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49 not found: ID does not exist" containerID="d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.534001 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49"} err="failed to get container status \"d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49\": rpc error: 
code = NotFound desc = could not find container \"d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49\": container with ID starting with d5d4ccbeceba7a2385570fc1728c780b64bc59c022129b0864e13fc74b499b49 not found: ID does not exist" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.534030 4744 scope.go:117] "RemoveContainer" containerID="6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5" Sep 30 03:16:59 crc kubenswrapper[4744]: E0930 03:16:59.534532 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5\": container with ID starting with 6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5 not found: ID does not exist" containerID="6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5" Sep 30 03:16:59 crc kubenswrapper[4744]: I0930 03:16:59.534551 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5"} err="failed to get container status \"6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5\": rpc error: code = NotFound desc = could not find container \"6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5\": container with ID starting with 6353cc9855d171a9969ffaa0427c15fb31311e0d2f01aaa4b0bf60feda170af5 not found: ID does not exist" Sep 30 03:17:08 crc kubenswrapper[4744]: I0930 03:17:08.555800 4744 generic.go:334] "Generic (PLEG): container finished" podID="341a2cff-5aae-4952-a8d8-64d5e247d7f9" containerID="6f982afaada728532abab523248fce704daba34b59b8e54f85f542b475b6ecf2" exitCode=0 Sep 30 03:17:08 crc kubenswrapper[4744]: I0930 03:17:08.556406 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"341a2cff-5aae-4952-a8d8-64d5e247d7f9","Type":"ContainerDied","Data":"6f982afaada728532abab523248fce704daba34b59b8e54f85f542b475b6ecf2"} Sep 30 03:17:08 crc kubenswrapper[4744]: I0930 03:17:08.563599 4744 generic.go:334] "Generic (PLEG): container finished" podID="7d180fc4-3fb0-4db5-99d7-913559d8ec2e" containerID="6e01dced2ff7ee2e3f41809bbf3612ee8a80c7725fa61665fd439d042c78ec2e" exitCode=0 Sep 30 03:17:08 crc kubenswrapper[4744]: I0930 03:17:08.563639 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7d180fc4-3fb0-4db5-99d7-913559d8ec2e","Type":"ContainerDied","Data":"6e01dced2ff7ee2e3f41809bbf3612ee8a80c7725fa61665fd439d042c78ec2e"} Sep 30 03:17:09 crc kubenswrapper[4744]: I0930 03:17:09.577030 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7d180fc4-3fb0-4db5-99d7-913559d8ec2e","Type":"ContainerStarted","Data":"e38034c1b5c9b4e0cdde7900606d30d5d0bdbfd22d07c1922eedc8f132a7dd4c"} Sep 30 03:17:09 crc kubenswrapper[4744]: I0930 03:17:09.578077 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 03:17:09 crc kubenswrapper[4744]: I0930 03:17:09.580551 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"341a2cff-5aae-4952-a8d8-64d5e247d7f9","Type":"ContainerStarted","Data":"3d15ee907a69f512ba4bdd89f063db2fa6009b161cb49f5698948388d8515977"} Sep 30 03:17:09 crc kubenswrapper[4744]: I0930 03:17:09.580752 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 03:17:09 crc kubenswrapper[4744]: I0930 03:17:09.621564 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.621536363 podStartE2EDuration="33.621536363s" podCreationTimestamp="2025-09-30 03:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:17:09.603219723 +0000 UTC m=+1356.776439707" watchObservedRunningTime="2025-09-30 03:17:09.621536363 +0000 UTC m=+1356.794756347" Sep 30 03:17:09 crc kubenswrapper[4744]: I0930 03:17:09.648469 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=33.648451268 podStartE2EDuration="33.648451268s" podCreationTimestamp="2025-09-30 03:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:17:09.639812759 +0000 UTC m=+1356.813032733" watchObservedRunningTime="2025-09-30 03:17:09.648451268 +0000 UTC m=+1356.821671242" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.829863 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"] Sep 30 03:17:11 crc kubenswrapper[4744]: E0930 03:17:11.830604 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerName="init" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.830616 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerName="init" Sep 30 03:17:11 crc kubenswrapper[4744]: E0930 03:17:11.830641 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerName="dnsmasq-dns" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.830651 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerName="dnsmasq-dns" Sep 30 03:17:11 crc kubenswrapper[4744]: E0930 03:17:11.830667 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerName="init" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.830674 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerName="init" Sep 30 03:17:11 crc kubenswrapper[4744]: E0930 03:17:11.830692 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerName="dnsmasq-dns" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.830699 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerName="dnsmasq-dns" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.830880 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e077c18-78cc-4f7d-b7d0-8ed5adc31a29" containerName="dnsmasq-dns" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.830895 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54a9e4e-1abf-4e10-a7cb-98c582b531fa" containerName="dnsmasq-dns" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.831534 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.834911 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.835083 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.835200 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.835302 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:17:11 crc kubenswrapper[4744]: I0930 03:17:11.843030 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"] Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.011079 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.011154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.011427 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.011536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6x2q\" (UniqueName: \"kubernetes.io/projected/1b061989-0be6-4c0d-800f-05bedb5c9a90-kube-api-access-w6x2q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.114509 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6x2q\" (UniqueName: \"kubernetes.io/projected/1b061989-0be6-4c0d-800f-05bedb5c9a90-kube-api-access-w6x2q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.114693 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.116071 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.116784 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.124592 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.125060 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.126173 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.148610 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6x2q\" (UniqueName: \"kubernetes.io/projected/1b061989-0be6-4c0d-800f-05bedb5c9a90-kube-api-access-w6x2q\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"
Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.196197 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"
Sep 30 03:17:12 crc kubenswrapper[4744]: I0930 03:17:12.811477 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"]
Sep 30 03:17:13 crc kubenswrapper[4744]: I0930 03:17:13.636556 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" event={"ID":"1b061989-0be6-4c0d-800f-05bedb5c9a90","Type":"ContainerStarted","Data":"9d9417bd5e4bfddb82be67b2240a3bb3d13f93ef72779bfe757d305524b63881"}
Sep 30 03:17:22 crc kubenswrapper[4744]: I0930 03:17:22.730766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" event={"ID":"1b061989-0be6-4c0d-800f-05bedb5c9a90","Type":"ContainerStarted","Data":"ccc21c01df3911bc0da687ea0a6eb5aa29d31000bef62bcf6d25ebb1b0c15e5a"}
Sep 30 03:17:22 crc kubenswrapper[4744]: I0930 03:17:22.752324 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" podStartSLOduration=2.301163671 podStartE2EDuration="11.75229294s" podCreationTimestamp="2025-09-30 03:17:11 +0000 UTC" firstStartedPulling="2025-09-30 03:17:12.82041774 +0000 UTC m=+1359.993637724" lastFinishedPulling="2025-09-30 03:17:22.271547009 +0000 UTC m=+1369.444766993" observedRunningTime="2025-09-30 03:17:22.746614563 +0000 UTC m=+1369.919834577" watchObservedRunningTime="2025-09-30 03:17:22.75229294 +0000 UTC m=+1369.925512964"
Sep 30 03:17:26 crc kubenswrapper[4744]: I0930 03:17:26.536636 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 03:17:26 crc kubenswrapper[4744]: I0930 03:17:26.573728 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 30 03:17:34 crc kubenswrapper[4744]: I0930 03:17:34.869837 4744 generic.go:334] "Generic (PLEG): container finished" podID="1b061989-0be6-4c0d-800f-05bedb5c9a90" containerID="ccc21c01df3911bc0da687ea0a6eb5aa29d31000bef62bcf6d25ebb1b0c15e5a" exitCode=0
Sep 30 03:17:34 crc kubenswrapper[4744]: I0930 03:17:34.869936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" event={"ID":"1b061989-0be6-4c0d-800f-05bedb5c9a90","Type":"ContainerDied","Data":"ccc21c01df3911bc0da687ea0a6eb5aa29d31000bef62bcf6d25ebb1b0c15e5a"}
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.496598 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.593228 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6x2q\" (UniqueName: \"kubernetes.io/projected/1b061989-0be6-4c0d-800f-05bedb5c9a90-kube-api-access-w6x2q\") pod \"1b061989-0be6-4c0d-800f-05bedb5c9a90\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") "
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.593355 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-ssh-key\") pod \"1b061989-0be6-4c0d-800f-05bedb5c9a90\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") "
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.593556 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-repo-setup-combined-ca-bundle\") pod \"1b061989-0be6-4c0d-800f-05bedb5c9a90\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") "
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.593691 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-inventory\") pod \"1b061989-0be6-4c0d-800f-05bedb5c9a90\" (UID: \"1b061989-0be6-4c0d-800f-05bedb5c9a90\") "
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.600254 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b061989-0be6-4c0d-800f-05bedb5c9a90-kube-api-access-w6x2q" (OuterVolumeSpecName: "kube-api-access-w6x2q") pod "1b061989-0be6-4c0d-800f-05bedb5c9a90" (UID: "1b061989-0be6-4c0d-800f-05bedb5c9a90"). InnerVolumeSpecName "kube-api-access-w6x2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.601910 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1b061989-0be6-4c0d-800f-05bedb5c9a90" (UID: "1b061989-0be6-4c0d-800f-05bedb5c9a90"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.647711 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b061989-0be6-4c0d-800f-05bedb5c9a90" (UID: "1b061989-0be6-4c0d-800f-05bedb5c9a90"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.649138 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-inventory" (OuterVolumeSpecName: "inventory") pod "1b061989-0be6-4c0d-800f-05bedb5c9a90" (UID: "1b061989-0be6-4c0d-800f-05bedb5c9a90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.697450 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6x2q\" (UniqueName: \"kubernetes.io/projected/1b061989-0be6-4c0d-800f-05bedb5c9a90-kube-api-access-w6x2q\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.697801 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.697937 4744 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.698059 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b061989-0be6-4c0d-800f-05bedb5c9a90-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.896890 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m" event={"ID":"1b061989-0be6-4c0d-800f-05bedb5c9a90","Type":"ContainerDied","Data":"9d9417bd5e4bfddb82be67b2240a3bb3d13f93ef72779bfe757d305524b63881"}
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.896950 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d9417bd5e4bfddb82be67b2240a3bb3d13f93ef72779bfe757d305524b63881"
Sep 30 03:17:36 crc kubenswrapper[4744]: I0930 03:17:36.896986 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.024724 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"]
Sep 30 03:17:37 crc kubenswrapper[4744]: E0930 03:17:37.025296 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b061989-0be6-4c0d-800f-05bedb5c9a90" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.025326 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b061989-0be6-4c0d-800f-05bedb5c9a90" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.025724 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b061989-0be6-4c0d-800f-05bedb5c9a90" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.026739 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.036727 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"]
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.051400 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.051563 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.051717 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.051892 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.218490 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.218573 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvm6m\" (UniqueName: \"kubernetes.io/projected/50f909f5-fbe0-489d-bb41-59a3318cd416-kube-api-access-bvm6m\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.218677 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.320492 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.320603 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvm6m\" (UniqueName: \"kubernetes.io/projected/50f909f5-fbe0-489d-bb41-59a3318cd416-kube-api-access-bvm6m\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.320722 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.326066 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.327433 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.338683 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvm6m\" (UniqueName: \"kubernetes.io/projected/50f909f5-fbe0-489d-bb41-59a3318cd416-kube-api-access-bvm6m\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-299d4\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.384007 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:37 crc kubenswrapper[4744]: I0930 03:17:37.959156 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"]
Sep 30 03:17:38 crc kubenswrapper[4744]: I0930 03:17:38.055177 4744 scope.go:117] "RemoveContainer" containerID="24f5619a5975cd51a8e3b5fea396cd20cfb24b932d853638e967b46f49c88350"
Sep 30 03:17:38 crc kubenswrapper[4744]: I0930 03:17:38.110855 4744 scope.go:117] "RemoveContainer" containerID="b47da698457c41ad15230a0e3bf737f8c71f90153142d8f17152c9157c30ffd4"
Sep 30 03:17:38 crc kubenswrapper[4744]: I0930 03:17:38.154545 4744 scope.go:117] "RemoveContainer" containerID="d35f650e93f4d0a82badcd2cc8fabdf7f9cc9941b77275dfce70d6830b014ba0"
Sep 30 03:17:38 crc kubenswrapper[4744]: I0930 03:17:38.924822 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4" event={"ID":"50f909f5-fbe0-489d-bb41-59a3318cd416","Type":"ContainerStarted","Data":"a505b3d13defb5343e4278803ba54ee180d87409cbb0a08c4b8e11e55310a669"}
Sep 30 03:17:38 crc kubenswrapper[4744]: I0930 03:17:38.925122 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4" event={"ID":"50f909f5-fbe0-489d-bb41-59a3318cd416","Type":"ContainerStarted","Data":"20ea5ca5597e638b1f0152c94acd00f19339f0fa9c616a430bb47316ac02ef0a"}
Sep 30 03:17:38 crc kubenswrapper[4744]: I0930 03:17:38.948553 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4" podStartSLOduration=2.52572115 podStartE2EDuration="2.948531983s" podCreationTimestamp="2025-09-30 03:17:36 +0000 UTC" firstStartedPulling="2025-09-30 03:17:37.958335808 +0000 UTC m=+1385.131555782" lastFinishedPulling="2025-09-30 03:17:38.381146641 +0000 UTC m=+1385.554366615" observedRunningTime="2025-09-30 03:17:38.94327586 +0000 UTC m=+1386.116495844" watchObservedRunningTime="2025-09-30 03:17:38.948531983 +0000 UTC m=+1386.121751957"
Sep 30 03:17:41 crc kubenswrapper[4744]: I0930 03:17:41.962484 4744 generic.go:334] "Generic (PLEG): container finished" podID="50f909f5-fbe0-489d-bb41-59a3318cd416" containerID="a505b3d13defb5343e4278803ba54ee180d87409cbb0a08c4b8e11e55310a669" exitCode=0
Sep 30 03:17:41 crc kubenswrapper[4744]: I0930 03:17:41.962746 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4" event={"ID":"50f909f5-fbe0-489d-bb41-59a3318cd416","Type":"ContainerDied","Data":"a505b3d13defb5343e4278803ba54ee180d87409cbb0a08c4b8e11e55310a669"}
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.573221 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.765228 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-inventory\") pod \"50f909f5-fbe0-489d-bb41-59a3318cd416\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") "
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.765444 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvm6m\" (UniqueName: \"kubernetes.io/projected/50f909f5-fbe0-489d-bb41-59a3318cd416-kube-api-access-bvm6m\") pod \"50f909f5-fbe0-489d-bb41-59a3318cd416\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") "
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.765659 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-ssh-key\") pod \"50f909f5-fbe0-489d-bb41-59a3318cd416\" (UID: \"50f909f5-fbe0-489d-bb41-59a3318cd416\") "
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.783892 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f909f5-fbe0-489d-bb41-59a3318cd416-kube-api-access-bvm6m" (OuterVolumeSpecName: "kube-api-access-bvm6m") pod "50f909f5-fbe0-489d-bb41-59a3318cd416" (UID: "50f909f5-fbe0-489d-bb41-59a3318cd416"). InnerVolumeSpecName "kube-api-access-bvm6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.807801 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-inventory" (OuterVolumeSpecName: "inventory") pod "50f909f5-fbe0-489d-bb41-59a3318cd416" (UID: "50f909f5-fbe0-489d-bb41-59a3318cd416"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.839169 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50f909f5-fbe0-489d-bb41-59a3318cd416" (UID: "50f909f5-fbe0-489d-bb41-59a3318cd416"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.872640 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.872696 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f909f5-fbe0-489d-bb41-59a3318cd416-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.872718 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvm6m\" (UniqueName: \"kubernetes.io/projected/50f909f5-fbe0-489d-bb41-59a3318cd416-kube-api-access-bvm6m\") on node \"crc\" DevicePath \"\""
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.989444 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4" event={"ID":"50f909f5-fbe0-489d-bb41-59a3318cd416","Type":"ContainerDied","Data":"20ea5ca5597e638b1f0152c94acd00f19339f0fa9c616a430bb47316ac02ef0a"}
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.989502 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ea5ca5597e638b1f0152c94acd00f19339f0fa9c616a430bb47316ac02ef0a"
Sep 30 03:17:43 crc kubenswrapper[4744]: I0930 03:17:43.989578 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-299d4"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.090635 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"]
Sep 30 03:17:44 crc kubenswrapper[4744]: E0930 03:17:44.091276 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f909f5-fbe0-489d-bb41-59a3318cd416" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.091306 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f909f5-fbe0-489d-bb41-59a3318cd416" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.091722 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f909f5-fbe0-489d-bb41-59a3318cd416" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.093030 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.105323 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"]
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.133064 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.133217 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.133665 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.133969 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.179602 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.179672 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.179724 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcwj\" (UniqueName: \"kubernetes.io/projected/2abd1aec-872e-4bcb-a05f-c0d04d689489-kube-api-access-lxcwj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.180002 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.282203 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.282686 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.282726 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxcwj\" (UniqueName: \"kubernetes.io/projected/2abd1aec-872e-4bcb-a05f-c0d04d689489-kube-api-access-lxcwj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.282767 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.287001 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.288732 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.288764 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.299972 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcwj\" (UniqueName: \"kubernetes.io/projected/2abd1aec-872e-4bcb-a05f-c0d04d689489-kube-api-access-lxcwj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:44 crc kubenswrapper[4744]: I0930 03:17:44.446910 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"
Sep 30 03:17:45 crc kubenswrapper[4744]: I0930 03:17:45.035421 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq"]
Sep 30 03:17:46 crc kubenswrapper[4744]: I0930 03:17:46.020963 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" event={"ID":"2abd1aec-872e-4bcb-a05f-c0d04d689489","Type":"ContainerStarted","Data":"326b96835a795f2ba10033fea05dce485a669e59d47e34774aac75573e6d016e"}
Sep 30 03:17:46 crc kubenswrapper[4744]: I0930 03:17:46.021341 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" event={"ID":"2abd1aec-872e-4bcb-a05f-c0d04d689489","Type":"ContainerStarted","Data":"dc8e6d22f4e31045c186bdcc8e5c820f7ac91a298240e784fc28e127175e88fa"}
Sep 30 03:17:46 crc kubenswrapper[4744]: I0930 03:17:46.052508 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" podStartSLOduration=1.596617423 podStartE2EDuration="2.052487952s" podCreationTimestamp="2025-09-30 03:17:44 +0000 UTC" firstStartedPulling="2025-09-30 03:17:45.043176604 +0000 UTC m=+1392.216396588" lastFinishedPulling="2025-09-30 03:17:45.499047113 +0000 UTC m=+1392.672267117" observedRunningTime="2025-09-30 03:17:46.045010909 +0000 UTC m=+1393.218230883" watchObservedRunningTime="2025-09-30 03:17:46.052487952 +0000 UTC m=+1393.225707926"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.194129 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5226b"]
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.197346 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.211953 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5226b"]
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.326211 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-catalog-content\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.326277 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-utilities\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.326539 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mnp\" (UniqueName: \"kubernetes.io/projected/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-kube-api-access-k8mnp\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.428937 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mnp\" (UniqueName: \"kubernetes.io/projected/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-kube-api-access-k8mnp\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.429100 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-catalog-content\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.429166 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-utilities\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.429963 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-utilities\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.430019 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-catalog-content\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.451953 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mnp\" (UniqueName: \"kubernetes.io/projected/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-kube-api-access-k8mnp\") pod \"redhat-operators-5226b\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:18 crc kubenswrapper[4744]: I0930 03:18:18.534234 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5226b"
Sep 30 03:18:19 crc kubenswrapper[4744]: I0930 03:18:19.033937 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5226b"]
Sep 30 03:18:19 crc kubenswrapper[4744]: I0930 03:18:19.429998 4744 generic.go:334] "Generic (PLEG): container finished" podID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerID="84b962894752c9025fd03a9e49dd04ef51b5389aa8898906784fe2421411d992" exitCode=0
Sep 30 03:18:19 crc kubenswrapper[4744]: I0930 03:18:19.430080 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerDied","Data":"84b962894752c9025fd03a9e49dd04ef51b5389aa8898906784fe2421411d992"}
Sep 30 03:18:19 crc kubenswrapper[4744]: I0930 03:18:19.430468 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerStarted","Data":"2db6cebf343aaf5304dc284fa4e53d90f111511faded8a4b0468b99219357369"}
Sep 30 03:18:21 crc kubenswrapper[4744]: I0930 03:18:21.478553 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerStarted","Data":"739d800a67a9c6ead3c83192d17f010730564e76dc9bd9d3f5a04b1998467740"}
Sep 30 03:18:23 crc kubenswrapper[4744]: I0930 03:18:23.503567 4744 generic.go:334] "Generic (PLEG): container finished" podID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerID="739d800a67a9c6ead3c83192d17f010730564e76dc9bd9d3f5a04b1998467740" exitCode=0
Sep 30 03:18:23 crc kubenswrapper[4744]: I0930 03:18:23.530491 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerDied","Data":"739d800a67a9c6ead3c83192d17f010730564e76dc9bd9d3f5a04b1998467740"}
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.522292 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerStarted","Data":"9cabe9d54e2ace64be7cc822129897aa5f5386c101399ef6bef2badc09c593a6"}
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.594231 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5226b" podStartSLOduration=1.9783133899999998 podStartE2EDuration="6.594202274s" podCreationTimestamp="2025-09-30 03:18:18 +0000 UTC" firstStartedPulling="2025-09-30 03:18:19.432057264 +0000 UTC m=+1426.605277238" lastFinishedPulling="2025-09-30 03:18:24.047946108 +0000 UTC m=+1431.221166122" observedRunningTime="2025-09-30 03:18:24.557587227 +0000 UTC m=+1431.730807241" watchObservedRunningTime="2025-09-30 03:18:24.594202274 +0000 UTC m=+1431.767422268"
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.606717 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9wj5c"]
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.635264 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wj5c"
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.642316 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wj5c"]
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.689447 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-utilities\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c"
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.689644 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-catalog-content\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c"
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.689793 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wl7x\" (UniqueName: \"kubernetes.io/projected/9ef15423-02c6-4a08-a66f-bfc15c990eb3-kube-api-access-9wl7x\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c"
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.791306 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-catalog-content\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c"
Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.791456 4744 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-9wl7x\" (UniqueName: \"kubernetes.io/projected/9ef15423-02c6-4a08-a66f-bfc15c990eb3-kube-api-access-9wl7x\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.791563 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-utilities\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.791936 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-catalog-content\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.792040 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-utilities\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.814563 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wl7x\" (UniqueName: \"kubernetes.io/projected/9ef15423-02c6-4a08-a66f-bfc15c990eb3-kube-api-access-9wl7x\") pod \"certified-operators-9wj5c\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:24 crc kubenswrapper[4744]: I0930 03:18:24.982335 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:25 crc kubenswrapper[4744]: I0930 03:18:25.536041 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wj5c"] Sep 30 03:18:25 crc kubenswrapper[4744]: W0930 03:18:25.544788 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef15423_02c6_4a08_a66f_bfc15c990eb3.slice/crio-1af8637d7643fcc26d4f4e98b839f5c7c447592d56bfc24bc3a9b091b2d52f3b WatchSource:0}: Error finding container 1af8637d7643fcc26d4f4e98b839f5c7c447592d56bfc24bc3a9b091b2d52f3b: Status 404 returned error can't find the container with id 1af8637d7643fcc26d4f4e98b839f5c7c447592d56bfc24bc3a9b091b2d52f3b Sep 30 03:18:26 crc kubenswrapper[4744]: I0930 03:18:26.553880 4744 generic.go:334] "Generic (PLEG): container finished" podID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerID="8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15" exitCode=0 Sep 30 03:18:26 crc kubenswrapper[4744]: I0930 03:18:26.554078 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerDied","Data":"8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15"} Sep 30 03:18:26 crc kubenswrapper[4744]: I0930 03:18:26.554296 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerStarted","Data":"1af8637d7643fcc26d4f4e98b839f5c7c447592d56bfc24bc3a9b091b2d52f3b"} Sep 30 03:18:27 crc kubenswrapper[4744]: I0930 03:18:27.569200 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" 
event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerStarted","Data":"c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d"} Sep 30 03:18:28 crc kubenswrapper[4744]: I0930 03:18:28.535010 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5226b" Sep 30 03:18:28 crc kubenswrapper[4744]: I0930 03:18:28.535085 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5226b" Sep 30 03:18:28 crc kubenswrapper[4744]: I0930 03:18:28.586040 4744 generic.go:334] "Generic (PLEG): container finished" podID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerID="c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d" exitCode=0 Sep 30 03:18:28 crc kubenswrapper[4744]: I0930 03:18:28.586105 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerDied","Data":"c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d"} Sep 30 03:18:29 crc kubenswrapper[4744]: I0930 03:18:29.592618 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5226b" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="registry-server" probeResult="failure" output=< Sep 30 03:18:29 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 03:18:29 crc kubenswrapper[4744]: > Sep 30 03:18:29 crc kubenswrapper[4744]: I0930 03:18:29.599277 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerStarted","Data":"6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55"} Sep 30 03:18:29 crc kubenswrapper[4744]: I0930 03:18:29.631389 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-9wj5c" podStartSLOduration=3.021869734 podStartE2EDuration="5.631353251s" podCreationTimestamp="2025-09-30 03:18:24 +0000 UTC" firstStartedPulling="2025-09-30 03:18:26.556919623 +0000 UTC m=+1433.730139637" lastFinishedPulling="2025-09-30 03:18:29.16640318 +0000 UTC m=+1436.339623154" observedRunningTime="2025-09-30 03:18:29.621751292 +0000 UTC m=+1436.794971256" watchObservedRunningTime="2025-09-30 03:18:29.631353251 +0000 UTC m=+1436.804573245" Sep 30 03:18:34 crc kubenswrapper[4744]: I0930 03:18:34.347579 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:18:34 crc kubenswrapper[4744]: I0930 03:18:34.348218 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:18:34 crc kubenswrapper[4744]: I0930 03:18:34.983703 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:34 crc kubenswrapper[4744]: I0930 03:18:34.983803 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:35 crc kubenswrapper[4744]: I0930 03:18:35.067344 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:35 crc kubenswrapper[4744]: I0930 03:18:35.771230 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:35 crc kubenswrapper[4744]: I0930 03:18:35.855226 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wj5c"] Sep 30 03:18:37 crc kubenswrapper[4744]: I0930 03:18:37.722336 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9wj5c" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="registry-server" containerID="cri-o://6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55" gracePeriod=2 Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.252203 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.303618 4744 scope.go:117] "RemoveContainer" containerID="3a22f35c6d1fc6b15165df25b9cf2424bf9c5f9423ecc4446132535757f7b8f9" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.311783 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wl7x\" (UniqueName: \"kubernetes.io/projected/9ef15423-02c6-4a08-a66f-bfc15c990eb3-kube-api-access-9wl7x\") pod \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.311828 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-catalog-content\") pod \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\" (UID: \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.312116 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-utilities\") pod \"9ef15423-02c6-4a08-a66f-bfc15c990eb3\" (UID: 
\"9ef15423-02c6-4a08-a66f-bfc15c990eb3\") " Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.313332 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-utilities" (OuterVolumeSpecName: "utilities") pod "9ef15423-02c6-4a08-a66f-bfc15c990eb3" (UID: "9ef15423-02c6-4a08-a66f-bfc15c990eb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.321315 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef15423-02c6-4a08-a66f-bfc15c990eb3-kube-api-access-9wl7x" (OuterVolumeSpecName: "kube-api-access-9wl7x") pod "9ef15423-02c6-4a08-a66f-bfc15c990eb3" (UID: "9ef15423-02c6-4a08-a66f-bfc15c990eb3"). InnerVolumeSpecName "kube-api-access-9wl7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.351819 4744 scope.go:117] "RemoveContainer" containerID="d9552bee01f64bff75395f2036cae62211d09736edb53991f1d5a0ce1386546d" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.414405 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wl7x\" (UniqueName: \"kubernetes.io/projected/9ef15423-02c6-4a08-a66f-bfc15c990eb3-kube-api-access-9wl7x\") on node \"crc\" DevicePath \"\"" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.414435 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.436880 4744 scope.go:117] "RemoveContainer" containerID="e631872fe00b62bf45d049cdcceac111c1486239a0419fb604ef4c6fa813737f" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.499702 4744 scope.go:117] "RemoveContainer" 
containerID="247926f8f3cadac7f088a36ccd7fa8f1c667eaac56ae4420f2a8ec7c7c5b1c5b" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.596854 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5226b" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.660981 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5226b" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.734365 4744 generic.go:334] "Generic (PLEG): container finished" podID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerID="6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55" exitCode=0 Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.734393 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerDied","Data":"6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55"} Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.734434 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wj5c" event={"ID":"9ef15423-02c6-4a08-a66f-bfc15c990eb3","Type":"ContainerDied","Data":"1af8637d7643fcc26d4f4e98b839f5c7c447592d56bfc24bc3a9b091b2d52f3b"} Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.734464 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9wj5c" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.734480 4744 scope.go:117] "RemoveContainer" containerID="6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.782769 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ef15423-02c6-4a08-a66f-bfc15c990eb3" (UID: "9ef15423-02c6-4a08-a66f-bfc15c990eb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.802866 4744 scope.go:117] "RemoveContainer" containerID="c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.829881 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef15423-02c6-4a08-a66f-bfc15c990eb3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.832152 4744 scope.go:117] "RemoveContainer" containerID="8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.918960 4744 scope.go:117] "RemoveContainer" containerID="6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55" Sep 30 03:18:38 crc kubenswrapper[4744]: E0930 03:18:38.919418 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55\": container with ID starting with 6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55 not found: ID does not exist" containerID="6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55" Sep 30 03:18:38 crc 
kubenswrapper[4744]: I0930 03:18:38.919458 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55"} err="failed to get container status \"6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55\": rpc error: code = NotFound desc = could not find container \"6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55\": container with ID starting with 6110b66d7979559176fca54ca9d863d9f1b7d7371dc41b7ab86f6fdffd6fff55 not found: ID does not exist" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.919486 4744 scope.go:117] "RemoveContainer" containerID="c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d" Sep 30 03:18:38 crc kubenswrapper[4744]: E0930 03:18:38.919742 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d\": container with ID starting with c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d not found: ID does not exist" containerID="c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.919782 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d"} err="failed to get container status \"c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d\": rpc error: code = NotFound desc = could not find container \"c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d\": container with ID starting with c13720ac3e865ddbc1a44c1119fbcee85f6260f4e0c5fc700ed4db404e120f9d not found: ID does not exist" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.919803 4744 scope.go:117] "RemoveContainer" containerID="8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15" Sep 30 
03:18:38 crc kubenswrapper[4744]: E0930 03:18:38.920154 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15\": container with ID starting with 8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15 not found: ID does not exist" containerID="8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15" Sep 30 03:18:38 crc kubenswrapper[4744]: I0930 03:18:38.920197 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15"} err="failed to get container status \"8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15\": rpc error: code = NotFound desc = could not find container \"8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15\": container with ID starting with 8b3111f8ccf12d73e2cd97482ba5c695d36982fb216a45513387e267c653bf15 not found: ID does not exist" Sep 30 03:18:39 crc kubenswrapper[4744]: I0930 03:18:39.076891 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wj5c"] Sep 30 03:18:39 crc kubenswrapper[4744]: I0930 03:18:39.086322 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9wj5c"] Sep 30 03:18:39 crc kubenswrapper[4744]: I0930 03:18:39.517542 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" path="/var/lib/kubelet/pods/9ef15423-02c6-4a08-a66f-bfc15c990eb3/volumes" Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.337302 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5226b"] Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.337584 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-5226b" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="registry-server" containerID="cri-o://9cabe9d54e2ace64be7cc822129897aa5f5386c101399ef6bef2badc09c593a6" gracePeriod=2 Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.764595 4744 generic.go:334] "Generic (PLEG): container finished" podID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerID="9cabe9d54e2ace64be7cc822129897aa5f5386c101399ef6bef2badc09c593a6" exitCode=0 Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.764893 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerDied","Data":"9cabe9d54e2ace64be7cc822129897aa5f5386c101399ef6bef2badc09c593a6"} Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.880030 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5226b" Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.986981 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8mnp\" (UniqueName: \"kubernetes.io/projected/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-kube-api-access-k8mnp\") pod \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.987386 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-catalog-content\") pod \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.987530 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-utilities\") pod 
\"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\" (UID: \"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8\") " Sep 30 03:18:40 crc kubenswrapper[4744]: I0930 03:18:40.988760 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-utilities" (OuterVolumeSpecName: "utilities") pod "97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" (UID: "97ac7958-3b6a-4bf2-9160-784d4ad0e4f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.008629 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-kube-api-access-k8mnp" (OuterVolumeSpecName: "kube-api-access-k8mnp") pod "97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" (UID: "97ac7958-3b6a-4bf2-9160-784d4ad0e4f8"). InnerVolumeSpecName "kube-api-access-k8mnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.072478 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" (UID: "97ac7958-3b6a-4bf2-9160-784d4ad0e4f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.089588 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8mnp\" (UniqueName: \"kubernetes.io/projected/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-kube-api-access-k8mnp\") on node \"crc\" DevicePath \"\"" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.089615 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.089626 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.778898 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5226b" event={"ID":"97ac7958-3b6a-4bf2-9160-784d4ad0e4f8","Type":"ContainerDied","Data":"2db6cebf343aaf5304dc284fa4e53d90f111511faded8a4b0468b99219357369"} Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.778978 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5226b" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.779183 4744 scope.go:117] "RemoveContainer" containerID="9cabe9d54e2ace64be7cc822129897aa5f5386c101399ef6bef2badc09c593a6" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.805208 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5226b"] Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.812091 4744 scope.go:117] "RemoveContainer" containerID="739d800a67a9c6ead3c83192d17f010730564e76dc9bd9d3f5a04b1998467740" Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.815032 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5226b"] Sep 30 03:18:41 crc kubenswrapper[4744]: I0930 03:18:41.852544 4744 scope.go:117] "RemoveContainer" containerID="84b962894752c9025fd03a9e49dd04ef51b5389aa8898906784fe2421411d992" Sep 30 03:18:43 crc kubenswrapper[4744]: I0930 03:18:43.522253 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" path="/var/lib/kubelet/pods/97ac7958-3b6a-4bf2-9160-784d4ad0e4f8/volumes" Sep 30 03:19:04 crc kubenswrapper[4744]: I0930 03:19:04.347545 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:19:04 crc kubenswrapper[4744]: I0930 03:19:04.348223 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 
03:19:18.542822 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fjgh2"] Sep 30 03:19:18 crc kubenswrapper[4744]: E0930 03:19:18.544023 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="registry-server" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544045 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="registry-server" Sep 30 03:19:18 crc kubenswrapper[4744]: E0930 03:19:18.544077 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="extract-content" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544089 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="extract-content" Sep 30 03:19:18 crc kubenswrapper[4744]: E0930 03:19:18.544114 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="extract-utilities" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544128 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="extract-utilities" Sep 30 03:19:18 crc kubenswrapper[4744]: E0930 03:19:18.544197 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="registry-server" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544210 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="registry-server" Sep 30 03:19:18 crc kubenswrapper[4744]: E0930 03:19:18.544227 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="extract-utilities" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544239 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="extract-utilities" Sep 30 03:19:18 crc kubenswrapper[4744]: E0930 03:19:18.544255 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="extract-content" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544267 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="extract-content" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544819 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ac7958-3b6a-4bf2-9160-784d4ad0e4f8" containerName="registry-server" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.544857 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef15423-02c6-4a08-a66f-bfc15c990eb3" containerName="registry-server" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.547704 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.558715 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fjgh2"] Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.627232 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w2mr\" (UniqueName: \"kubernetes.io/projected/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-kube-api-access-5w2mr\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.627580 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-utilities\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.627638 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-catalog-content\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.729151 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w2mr\" (UniqueName: \"kubernetes.io/projected/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-kube-api-access-5w2mr\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.729259 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-utilities\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.729321 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-catalog-content\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.730036 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-catalog-content\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.730471 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-utilities\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.763766 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w2mr\" (UniqueName: \"kubernetes.io/projected/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-kube-api-access-5w2mr\") pod \"community-operators-fjgh2\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:18 crc kubenswrapper[4744]: I0930 03:19:18.869669 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:19 crc kubenswrapper[4744]: I0930 03:19:19.420834 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fjgh2"] Sep 30 03:19:20 crc kubenswrapper[4744]: I0930 03:19:20.344060 4744 generic.go:334] "Generic (PLEG): container finished" podID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerID="221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e" exitCode=0 Sep 30 03:19:20 crc kubenswrapper[4744]: I0930 03:19:20.344482 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjgh2" event={"ID":"b2c07857-b31c-4910-a7f2-d6ceffc5c35c","Type":"ContainerDied","Data":"221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e"} Sep 30 03:19:20 crc kubenswrapper[4744]: I0930 03:19:20.344528 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjgh2" event={"ID":"b2c07857-b31c-4910-a7f2-d6ceffc5c35c","Type":"ContainerStarted","Data":"1cbc5ebadc3606a6491b840507a83510fa9bd1e339114e085531704238a0afb0"} Sep 30 03:19:22 crc kubenswrapper[4744]: I0930 03:19:22.365445 4744 generic.go:334] "Generic (PLEG): container finished" podID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerID="00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c" exitCode=0 Sep 30 03:19:22 crc kubenswrapper[4744]: I0930 03:19:22.365560 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjgh2" event={"ID":"b2c07857-b31c-4910-a7f2-d6ceffc5c35c","Type":"ContainerDied","Data":"00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c"} Sep 30 03:19:23 crc kubenswrapper[4744]: I0930 03:19:23.379966 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjgh2" 
event={"ID":"b2c07857-b31c-4910-a7f2-d6ceffc5c35c","Type":"ContainerStarted","Data":"0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d"} Sep 30 03:19:23 crc kubenswrapper[4744]: I0930 03:19:23.402167 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fjgh2" podStartSLOduration=2.950448809 podStartE2EDuration="5.402118085s" podCreationTimestamp="2025-09-30 03:19:18 +0000 UTC" firstStartedPulling="2025-09-30 03:19:20.347973848 +0000 UTC m=+1487.521193832" lastFinishedPulling="2025-09-30 03:19:22.799643104 +0000 UTC m=+1489.972863108" observedRunningTime="2025-09-30 03:19:23.397714309 +0000 UTC m=+1490.570934323" watchObservedRunningTime="2025-09-30 03:19:23.402118085 +0000 UTC m=+1490.575338089" Sep 30 03:19:28 crc kubenswrapper[4744]: I0930 03:19:28.869996 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:28 crc kubenswrapper[4744]: I0930 03:19:28.870572 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:28 crc kubenswrapper[4744]: I0930 03:19:28.947902 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:29 crc kubenswrapper[4744]: I0930 03:19:29.531549 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:29 crc kubenswrapper[4744]: I0930 03:19:29.612479 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fjgh2"] Sep 30 03:19:31 crc kubenswrapper[4744]: I0930 03:19:31.479209 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fjgh2" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="registry-server" 
containerID="cri-o://0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d" gracePeriod=2 Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.032638 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.159340 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w2mr\" (UniqueName: \"kubernetes.io/projected/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-kube-api-access-5w2mr\") pod \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.159576 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-utilities\") pod \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.159683 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-catalog-content\") pod \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\" (UID: \"b2c07857-b31c-4910-a7f2-d6ceffc5c35c\") " Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.161012 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-utilities" (OuterVolumeSpecName: "utilities") pod "b2c07857-b31c-4910-a7f2-d6ceffc5c35c" (UID: "b2c07857-b31c-4910-a7f2-d6ceffc5c35c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.168085 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-kube-api-access-5w2mr" (OuterVolumeSpecName: "kube-api-access-5w2mr") pod "b2c07857-b31c-4910-a7f2-d6ceffc5c35c" (UID: "b2c07857-b31c-4910-a7f2-d6ceffc5c35c"). InnerVolumeSpecName "kube-api-access-5w2mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.206299 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2c07857-b31c-4910-a7f2-d6ceffc5c35c" (UID: "b2c07857-b31c-4910-a7f2-d6ceffc5c35c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.269628 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w2mr\" (UniqueName: \"kubernetes.io/projected/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-kube-api-access-5w2mr\") on node \"crc\" DevicePath \"\"" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.269767 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.269791 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c07857-b31c-4910-a7f2-d6ceffc5c35c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.496801 4744 generic.go:334] "Generic (PLEG): container finished" podID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" 
containerID="0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d" exitCode=0 Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.496892 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjgh2" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.496927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjgh2" event={"ID":"b2c07857-b31c-4910-a7f2-d6ceffc5c35c","Type":"ContainerDied","Data":"0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d"} Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.497292 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjgh2" event={"ID":"b2c07857-b31c-4910-a7f2-d6ceffc5c35c","Type":"ContainerDied","Data":"1cbc5ebadc3606a6491b840507a83510fa9bd1e339114e085531704238a0afb0"} Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.497342 4744 scope.go:117] "RemoveContainer" containerID="0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.537841 4744 scope.go:117] "RemoveContainer" containerID="00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.563169 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fjgh2"] Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.575981 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fjgh2"] Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.577122 4744 scope.go:117] "RemoveContainer" containerID="221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.661196 4744 scope.go:117] "RemoveContainer" containerID="0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d" Sep 30 
03:19:32 crc kubenswrapper[4744]: E0930 03:19:32.661639 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d\": container with ID starting with 0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d not found: ID does not exist" containerID="0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.661672 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d"} err="failed to get container status \"0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d\": rpc error: code = NotFound desc = could not find container \"0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d\": container with ID starting with 0b5c5cfeaea9a756966a906e8c43fa49a7c870fab1a63ddc72bef4caa82f2b2d not found: ID does not exist" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.661695 4744 scope.go:117] "RemoveContainer" containerID="00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c" Sep 30 03:19:32 crc kubenswrapper[4744]: E0930 03:19:32.662043 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c\": container with ID starting with 00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c not found: ID does not exist" containerID="00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.662068 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c"} err="failed to get container status 
\"00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c\": rpc error: code = NotFound desc = could not find container \"00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c\": container with ID starting with 00a5e46f0391322cb088754e0a4d1f6ef1d3f7cf6593cc6c7fdbfeee42294a3c not found: ID does not exist" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.662085 4744 scope.go:117] "RemoveContainer" containerID="221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e" Sep 30 03:19:32 crc kubenswrapper[4744]: E0930 03:19:32.662299 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e\": container with ID starting with 221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e not found: ID does not exist" containerID="221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e" Sep 30 03:19:32 crc kubenswrapper[4744]: I0930 03:19:32.662319 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e"} err="failed to get container status \"221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e\": rpc error: code = NotFound desc = could not find container \"221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e\": container with ID starting with 221c868c9ef3bbd8c6ccc3103f2f5737fb4a89e200ac6e5baf2d74876998095e not found: ID does not exist" Sep 30 03:19:33 crc kubenswrapper[4744]: I0930 03:19:33.523197 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" path="/var/lib/kubelet/pods/b2c07857-b31c-4910-a7f2-d6ceffc5c35c/volumes" Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.347284 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.347724 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.347777 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.348438 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.348516 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" gracePeriod=600 Sep 30 03:19:34 crc kubenswrapper[4744]: E0930 03:19:34.475717 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.539072 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" exitCode=0 Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.539134 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f"} Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.539327 4744 scope.go:117] "RemoveContainer" containerID="b7975d758249d48351e6b790ced251dcf0b3dce30fe61d2854cf2d73cc541951" Sep 30 03:19:34 crc kubenswrapper[4744]: I0930 03:19:34.542145 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:19:34 crc kubenswrapper[4744]: E0930 03:19:34.542737 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:19:47 crc kubenswrapper[4744]: I0930 03:19:47.503999 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:19:47 crc kubenswrapper[4744]: E0930 03:19:47.504843 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:20:00 crc kubenswrapper[4744]: I0930 03:20:00.505496 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:20:00 crc kubenswrapper[4744]: E0930 03:20:00.506479 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:20:12 crc kubenswrapper[4744]: I0930 03:20:12.504418 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:20:12 crc kubenswrapper[4744]: E0930 03:20:12.505511 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:20:23 crc kubenswrapper[4744]: I0930 03:20:23.512585 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:20:23 crc kubenswrapper[4744]: E0930 03:20:23.513594 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:20:34 crc kubenswrapper[4744]: I0930 03:20:34.503602 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:20:34 crc kubenswrapper[4744]: E0930 03:20:34.504685 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:20:47 crc kubenswrapper[4744]: I0930 03:20:47.505144 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:20:47 crc kubenswrapper[4744]: E0930 03:20:47.506358 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:20:50 crc kubenswrapper[4744]: I0930 03:20:50.521769 4744 generic.go:334] "Generic (PLEG): container finished" podID="2abd1aec-872e-4bcb-a05f-c0d04d689489" containerID="326b96835a795f2ba10033fea05dce485a669e59d47e34774aac75573e6d016e" exitCode=0 Sep 30 03:20:50 crc kubenswrapper[4744]: I0930 03:20:50.521901 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" event={"ID":"2abd1aec-872e-4bcb-a05f-c0d04d689489","Type":"ContainerDied","Data":"326b96835a795f2ba10033fea05dce485a669e59d47e34774aac75573e6d016e"} Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.046974 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.129216 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-ssh-key\") pod \"2abd1aec-872e-4bcb-a05f-c0d04d689489\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.129402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-inventory\") pod \"2abd1aec-872e-4bcb-a05f-c0d04d689489\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.129581 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-bootstrap-combined-ca-bundle\") pod \"2abd1aec-872e-4bcb-a05f-c0d04d689489\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.129700 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxcwj\" (UniqueName: \"kubernetes.io/projected/2abd1aec-872e-4bcb-a05f-c0d04d689489-kube-api-access-lxcwj\") pod \"2abd1aec-872e-4bcb-a05f-c0d04d689489\" (UID: \"2abd1aec-872e-4bcb-a05f-c0d04d689489\") " Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.134553 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2abd1aec-872e-4bcb-a05f-c0d04d689489-kube-api-access-lxcwj" (OuterVolumeSpecName: "kube-api-access-lxcwj") pod "2abd1aec-872e-4bcb-a05f-c0d04d689489" (UID: "2abd1aec-872e-4bcb-a05f-c0d04d689489"). InnerVolumeSpecName "kube-api-access-lxcwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.134965 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2abd1aec-872e-4bcb-a05f-c0d04d689489" (UID: "2abd1aec-872e-4bcb-a05f-c0d04d689489"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.157587 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-inventory" (OuterVolumeSpecName: "inventory") pod "2abd1aec-872e-4bcb-a05f-c0d04d689489" (UID: "2abd1aec-872e-4bcb-a05f-c0d04d689489"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.182308 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2abd1aec-872e-4bcb-a05f-c0d04d689489" (UID: "2abd1aec-872e-4bcb-a05f-c0d04d689489"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.231646 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.231675 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.231688 4744 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2abd1aec-872e-4bcb-a05f-c0d04d689489-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.231699 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxcwj\" (UniqueName: \"kubernetes.io/projected/2abd1aec-872e-4bcb-a05f-c0d04d689489-kube-api-access-lxcwj\") on node \"crc\" DevicePath \"\"" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.554768 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" event={"ID":"2abd1aec-872e-4bcb-a05f-c0d04d689489","Type":"ContainerDied","Data":"dc8e6d22f4e31045c186bdcc8e5c820f7ac91a298240e784fc28e127175e88fa"} Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.554827 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8e6d22f4e31045c186bdcc8e5c820f7ac91a298240e784fc28e127175e88fa" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.554849 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.696549 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2"] Sep 30 03:20:52 crc kubenswrapper[4744]: E0930 03:20:52.697156 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="extract-utilities" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.697204 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="extract-utilities" Sep 30 03:20:52 crc kubenswrapper[4744]: E0930 03:20:52.697227 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="extract-content" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.697238 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="extract-content" Sep 30 03:20:52 crc kubenswrapper[4744]: E0930 03:20:52.697264 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="registry-server" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.697274 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="registry-server" Sep 30 03:20:52 crc kubenswrapper[4744]: E0930 03:20:52.697300 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abd1aec-872e-4bcb-a05f-c0d04d689489" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.697311 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abd1aec-872e-4bcb-a05f-c0d04d689489" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.697646 
4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abd1aec-872e-4bcb-a05f-c0d04d689489" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.697670 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c07857-b31c-4910-a7f2-d6ceffc5c35c" containerName="registry-server" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.698654 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.703116 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.703736 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.704013 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.704009 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.713533 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2"] Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.846692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 
03:20:52.847258 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99c4\" (UniqueName: \"kubernetes.io/projected/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-kube-api-access-n99c4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.847462 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.950016 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.950261 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n99c4\" (UniqueName: \"kubernetes.io/projected/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-kube-api-access-n99c4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.950304 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.957243 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.975708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:52 crc kubenswrapper[4744]: I0930 03:20:52.981641 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99c4\" (UniqueName: \"kubernetes.io/projected/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-kube-api-access-n99c4\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:53 crc kubenswrapper[4744]: I0930 03:20:53.040872 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:20:53 crc kubenswrapper[4744]: I0930 03:20:53.459639 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:20:53 crc kubenswrapper[4744]: I0930 03:20:53.466284 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2"] Sep 30 03:20:53 crc kubenswrapper[4744]: I0930 03:20:53.567296 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" event={"ID":"1ec7b740-1236-48b8-9aa5-0fd0c2f64380","Type":"ContainerStarted","Data":"f56807c6c5ea66ca564d86766ffee2d304375eea4cbe675cdb9929fa2e2c6f40"} Sep 30 03:20:54 crc kubenswrapper[4744]: I0930 03:20:54.580934 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" event={"ID":"1ec7b740-1236-48b8-9aa5-0fd0c2f64380","Type":"ContainerStarted","Data":"9c9d87c9ba14b911a8ed87dd992556b45ddab517f75dc841eda46b0ee5c3456a"} Sep 30 03:20:54 crc kubenswrapper[4744]: I0930 03:20:54.614771 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" podStartSLOduration=2.148808891 podStartE2EDuration="2.614750081s" podCreationTimestamp="2025-09-30 03:20:52 +0000 UTC" firstStartedPulling="2025-09-30 03:20:53.459274907 +0000 UTC m=+1580.632494901" lastFinishedPulling="2025-09-30 03:20:53.925216067 +0000 UTC m=+1581.098436091" observedRunningTime="2025-09-30 03:20:54.601805689 +0000 UTC m=+1581.775025713" watchObservedRunningTime="2025-09-30 03:20:54.614750081 +0000 UTC m=+1581.787970065" Sep 30 03:20:59 crc kubenswrapper[4744]: I0930 03:20:59.505111 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:20:59 crc 
kubenswrapper[4744]: E0930 03:20:59.505925 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:21:10 crc kubenswrapper[4744]: I0930 03:21:10.504529 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:21:10 crc kubenswrapper[4744]: E0930 03:21:10.505299 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:21:23 crc kubenswrapper[4744]: I0930 03:21:23.509464 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:21:23 crc kubenswrapper[4744]: E0930 03:21:23.510265 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:21:38 crc kubenswrapper[4744]: I0930 03:21:38.504328 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 
30 03:21:38 crc kubenswrapper[4744]: E0930 03:21:38.505040 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:21:48 crc kubenswrapper[4744]: I0930 03:21:48.057010 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n78cc"] Sep 30 03:21:48 crc kubenswrapper[4744]: I0930 03:21:48.070020 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n78cc"] Sep 30 03:21:49 crc kubenswrapper[4744]: I0930 03:21:49.504027 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:21:49 crc kubenswrapper[4744]: E0930 03:21:49.504679 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:21:49 crc kubenswrapper[4744]: I0930 03:21:49.517429 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455eafbf-df1e-4746-ad86-5959ce329b4e" path="/var/lib/kubelet/pods/455eafbf-df1e-4746-ad86-5959ce329b4e/volumes" Sep 30 03:21:51 crc kubenswrapper[4744]: I0930 03:21:51.056094 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-c75vh"] Sep 30 03:21:51 crc kubenswrapper[4744]: I0930 03:21:51.071087 4744 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/keystone-db-create-c75vh"] Sep 30 03:21:51 crc kubenswrapper[4744]: I0930 03:21:51.081413 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hnh57"] Sep 30 03:21:51 crc kubenswrapper[4744]: I0930 03:21:51.091430 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hnh57"] Sep 30 03:21:51 crc kubenswrapper[4744]: I0930 03:21:51.518671 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de338fd0-f1cc-4fcb-b690-6777c2da57ce" path="/var/lib/kubelet/pods/de338fd0-f1cc-4fcb-b690-6777c2da57ce/volumes" Sep 30 03:21:51 crc kubenswrapper[4744]: I0930 03:21:51.522692 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a89fd1-cf25-4278-974d-e3d51a3ee539" path="/var/lib/kubelet/pods/e1a89fd1-cf25-4278-974d-e3d51a3ee539/volumes" Sep 30 03:22:01 crc kubenswrapper[4744]: I0930 03:22:01.058454 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3f9a-account-create-jztrn"] Sep 30 03:22:01 crc kubenswrapper[4744]: I0930 03:22:01.074678 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1cd7-account-create-sghlw"] Sep 30 03:22:01 crc kubenswrapper[4744]: I0930 03:22:01.083805 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1cd7-account-create-sghlw"] Sep 30 03:22:01 crc kubenswrapper[4744]: I0930 03:22:01.094027 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3f9a-account-create-jztrn"] Sep 30 03:22:01 crc kubenswrapper[4744]: I0930 03:22:01.523780 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee108d7-7bde-43d7-8ae6-998fca50beda" path="/var/lib/kubelet/pods/dee108d7-7bde-43d7-8ae6-998fca50beda/volumes" Sep 30 03:22:01 crc kubenswrapper[4744]: I0930 03:22:01.526921 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ed2a0b-0154-4130-b676-15608db7b540" 
path="/var/lib/kubelet/pods/e2ed2a0b-0154-4130-b676-15608db7b540/volumes" Sep 30 03:22:02 crc kubenswrapper[4744]: I0930 03:22:02.503676 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:22:02 crc kubenswrapper[4744]: E0930 03:22:02.503937 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:22:08 crc kubenswrapper[4744]: I0930 03:22:08.043033 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8331-account-create-gqdbp"] Sep 30 03:22:08 crc kubenswrapper[4744]: I0930 03:22:08.062444 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8331-account-create-gqdbp"] Sep 30 03:22:09 crc kubenswrapper[4744]: I0930 03:22:09.518974 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14222f9-2d27-470f-938b-95b8176185ba" path="/var/lib/kubelet/pods/d14222f9-2d27-470f-938b-95b8176185ba/volumes" Sep 30 03:22:14 crc kubenswrapper[4744]: I0930 03:22:14.503603 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:22:14 crc kubenswrapper[4744]: E0930 03:22:14.504917 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.048439 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-p2lq5"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.067551 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-p2lq5"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.076389 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jt29c"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.085157 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-4mczb"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.094089 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tncc7"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.101564 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jt29c"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.107611 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-4mczb"] Sep 30 03:22:20 crc kubenswrapper[4744]: I0930 03:22:20.113701 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tncc7"] Sep 30 03:22:21 crc kubenswrapper[4744]: I0930 03:22:21.528228 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa050c7-72ad-4eba-8fee-9990ca164f78" path="/var/lib/kubelet/pods/1aa050c7-72ad-4eba-8fee-9990ca164f78/volumes" Sep 30 03:22:21 crc kubenswrapper[4744]: I0930 03:22:21.530477 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8221af2a-5c5a-4b77-82da-57f20d0e50c7" path="/var/lib/kubelet/pods/8221af2a-5c5a-4b77-82da-57f20d0e50c7/volumes" Sep 30 03:22:21 crc kubenswrapper[4744]: I0930 03:22:21.531970 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890bdf76-6137-4d6f-b87c-30f5e215cd21" 
path="/var/lib/kubelet/pods/890bdf76-6137-4d6f-b87c-30f5e215cd21/volumes" Sep 30 03:22:21 crc kubenswrapper[4744]: I0930 03:22:21.533266 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e9b9bc-414a-4ddf-8757-6fd3ea97306e" path="/var/lib/kubelet/pods/e3e9b9bc-414a-4ddf-8757-6fd3ea97306e/volumes" Sep 30 03:22:27 crc kubenswrapper[4744]: I0930 03:22:27.503614 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:22:27 crc kubenswrapper[4744]: E0930 03:22:27.504755 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.048454 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-b39b-account-create-vdd2g"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.055782 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9eca-account-create-n9msd"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.063432 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-21d3-account-create-w8rr5"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.070833 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86ab-account-create-7bn4b"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.079420 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-b39b-account-create-vdd2g"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.087514 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-86ab-account-create-7bn4b"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.099828 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-21d3-account-create-w8rr5"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.107747 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9eca-account-create-n9msd"] Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.892851 4744 generic.go:334] "Generic (PLEG): container finished" podID="1ec7b740-1236-48b8-9aa5-0fd0c2f64380" containerID="9c9d87c9ba14b911a8ed87dd992556b45ddab517f75dc841eda46b0ee5c3456a" exitCode=0 Sep 30 03:22:32 crc kubenswrapper[4744]: I0930 03:22:32.892911 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" event={"ID":"1ec7b740-1236-48b8-9aa5-0fd0c2f64380","Type":"ContainerDied","Data":"9c9d87c9ba14b911a8ed87dd992556b45ddab517f75dc841eda46b0ee5c3456a"} Sep 30 03:22:33 crc kubenswrapper[4744]: I0930 03:22:33.529313 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f58173-fb96-4909-ae89-9648dddcedf0" path="/var/lib/kubelet/pods/11f58173-fb96-4909-ae89-9648dddcedf0/volumes" Sep 30 03:22:33 crc kubenswrapper[4744]: I0930 03:22:33.530507 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea" path="/var/lib/kubelet/pods/1c1aaa7c-986e-4b9e-b8a2-fc913a6230ea/volumes" Sep 30 03:22:33 crc kubenswrapper[4744]: I0930 03:22:33.531185 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26cadf80-2d75-4df3-a182-17078060bd12" path="/var/lib/kubelet/pods/26cadf80-2d75-4df3-a182-17078060bd12/volumes" Sep 30 03:22:33 crc kubenswrapper[4744]: I0930 03:22:33.532519 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3adb9ba0-b837-4ba5-9fa9-faa07f3dc718" path="/var/lib/kubelet/pods/3adb9ba0-b837-4ba5-9fa9-faa07f3dc718/volumes" Sep 30 
03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.374225 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.413264 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-ssh-key\") pod \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.413433 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n99c4\" (UniqueName: \"kubernetes.io/projected/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-kube-api-access-n99c4\") pod \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.413566 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-inventory\") pod \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\" (UID: \"1ec7b740-1236-48b8-9aa5-0fd0c2f64380\") " Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.422225 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-kube-api-access-n99c4" (OuterVolumeSpecName: "kube-api-access-n99c4") pod "1ec7b740-1236-48b8-9aa5-0fd0c2f64380" (UID: "1ec7b740-1236-48b8-9aa5-0fd0c2f64380"). InnerVolumeSpecName "kube-api-access-n99c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.456355 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ec7b740-1236-48b8-9aa5-0fd0c2f64380" (UID: "1ec7b740-1236-48b8-9aa5-0fd0c2f64380"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.456518 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-inventory" (OuterVolumeSpecName: "inventory") pod "1ec7b740-1236-48b8-9aa5-0fd0c2f64380" (UID: "1ec7b740-1236-48b8-9aa5-0fd0c2f64380"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.515913 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.515937 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.515948 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n99c4\" (UniqueName: \"kubernetes.io/projected/1ec7b740-1236-48b8-9aa5-0fd0c2f64380-kube-api-access-n99c4\") on node \"crc\" DevicePath \"\"" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.920590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" 
event={"ID":"1ec7b740-1236-48b8-9aa5-0fd0c2f64380","Type":"ContainerDied","Data":"f56807c6c5ea66ca564d86766ffee2d304375eea4cbe675cdb9929fa2e2c6f40"} Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.920632 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f56807c6c5ea66ca564d86766ffee2d304375eea4cbe675cdb9929fa2e2c6f40" Sep 30 03:22:34 crc kubenswrapper[4744]: I0930 03:22:34.920641 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.019383 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n"] Sep 30 03:22:35 crc kubenswrapper[4744]: E0930 03:22:35.020081 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec7b740-1236-48b8-9aa5-0fd0c2f64380" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.020209 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec7b740-1236-48b8-9aa5-0fd0c2f64380" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.020582 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec7b740-1236-48b8-9aa5-0fd0c2f64380" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.021601 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.023873 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.023903 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.024581 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.025588 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.026140 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.026317 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.026434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb79k\" (UniqueName: 
\"kubernetes.io/projected/bb8619b8-471f-4b9c-a9ee-97f668713bec-kube-api-access-cb79k\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.043465 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n"] Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.067910 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vb2lg"] Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.081356 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vb2lg"] Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.127373 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.127525 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.127588 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb79k\" (UniqueName: \"kubernetes.io/projected/bb8619b8-471f-4b9c-a9ee-97f668713bec-kube-api-access-cb79k\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.132461 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.132562 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.155271 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb79k\" (UniqueName: \"kubernetes.io/projected/bb8619b8-471f-4b9c-a9ee-97f668713bec-kube-api-access-cb79k\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.399722 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.524867 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5895c0e5-f156-4a12-952c-3690cf355178" path="/var/lib/kubelet/pods/5895c0e5-f156-4a12-952c-3690cf355178/volumes" Sep 30 03:22:35 crc kubenswrapper[4744]: I0930 03:22:35.993364 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n"] Sep 30 03:22:35 crc kubenswrapper[4744]: W0930 03:22:35.993486 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb8619b8_471f_4b9c_a9ee_97f668713bec.slice/crio-fae86aabaa621aeff741d154edd64374ed43dd58f2d098e0b8d68399fdb16e1e WatchSource:0}: Error finding container fae86aabaa621aeff741d154edd64374ed43dd58f2d098e0b8d68399fdb16e1e: Status 404 returned error can't find the container with id fae86aabaa621aeff741d154edd64374ed43dd58f2d098e0b8d68399fdb16e1e Sep 30 03:22:36 crc kubenswrapper[4744]: I0930 03:22:36.951490 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" event={"ID":"bb8619b8-471f-4b9c-a9ee-97f668713bec","Type":"ContainerStarted","Data":"eb95c5a7c81091a55d90d2cb807800a7e16d4c5bce9684a21b5c3af5d698e866"} Sep 30 03:22:36 crc kubenswrapper[4744]: I0930 03:22:36.951565 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" event={"ID":"bb8619b8-471f-4b9c-a9ee-97f668713bec","Type":"ContainerStarted","Data":"fae86aabaa621aeff741d154edd64374ed43dd58f2d098e0b8d68399fdb16e1e"} Sep 30 03:22:36 crc kubenswrapper[4744]: I0930 03:22:36.974167 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" 
podStartSLOduration=2.478009411 podStartE2EDuration="2.97413735s" podCreationTimestamp="2025-09-30 03:22:34 +0000 UTC" firstStartedPulling="2025-09-30 03:22:35.995765663 +0000 UTC m=+1683.168985647" lastFinishedPulling="2025-09-30 03:22:36.491893572 +0000 UTC m=+1683.665113586" observedRunningTime="2025-09-30 03:22:36.97122708 +0000 UTC m=+1684.144447084" watchObservedRunningTime="2025-09-30 03:22:36.97413735 +0000 UTC m=+1684.147357364" Sep 30 03:22:38 crc kubenswrapper[4744]: I0930 03:22:38.919290 4744 scope.go:117] "RemoveContainer" containerID="3be0d944db0f753421ffb9b4a3884f87421a49ff54d5be7d33bbe428d030c27f" Sep 30 03:22:38 crc kubenswrapper[4744]: I0930 03:22:38.945553 4744 scope.go:117] "RemoveContainer" containerID="118332e00d4481fed966331aa6b3917ade1485cbc92c6fad7636f4273e0372e6" Sep 30 03:22:38 crc kubenswrapper[4744]: I0930 03:22:38.999458 4744 scope.go:117] "RemoveContainer" containerID="15dcb2a2dea1e11b7f480a9931abdb951390c946451a21ed5ea7cad5f455e442" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.044789 4744 scope.go:117] "RemoveContainer" containerID="e07dd89cc8e93e9a7e605ffb206b57b726361eedeb26262a617fb201d897c707" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.099515 4744 scope.go:117] "RemoveContainer" containerID="fbae2c1ec3e1b84d16b40aa322c78a32377c67e7e61aa749e4aa9dab39f03622" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.145028 4744 scope.go:117] "RemoveContainer" containerID="83f85c7d9ffca4179a51b1746a27f3cc817454ba176d06c15c2850e1eaa8d33f" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.167966 4744 scope.go:117] "RemoveContainer" containerID="c7d5914b4bcc66674797ac0ebbd0c2eb12dd3723f702c9a471449e9418d173e7" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.196797 4744 scope.go:117] "RemoveContainer" containerID="f8cb28a160438fd03a3d29cb6842ae9be12b992183f86c0ea5d6df3ba8b6ca41" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.243800 4744 scope.go:117] "RemoveContainer" 
containerID="d4e5c065bf105d380933b3bc1a1d87e4790fdc380c41270a9f7b6c4b0c42ccfc" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.265550 4744 scope.go:117] "RemoveContainer" containerID="64de70424d8528e88d8673100eba28fa7cda134b75760beff161eae520e4af96" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.287249 4744 scope.go:117] "RemoveContainer" containerID="49000545f8bd30abbc27e44a25e927d55cab885e11a61196be3c674b05913650" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.312824 4744 scope.go:117] "RemoveContainer" containerID="c1cfd7551f5035168439ba2e5fcccc50cd250d660465e48451afbd34ef80f772" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.341247 4744 scope.go:117] "RemoveContainer" containerID="dfc5c8fdd01c4a3b506511acb2b5f895565ef84d62502bcdfce502fe2d470c12" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.368121 4744 scope.go:117] "RemoveContainer" containerID="fcd7868f22ee9825d41fb46adb29a09efced44d28c972d90e6e7424e3add8b51" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.401986 4744 scope.go:117] "RemoveContainer" containerID="b076b97a7f60a0a3d96488ba33d74bb941f97db0a60f0e614cf1d8f67e80394a" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.440013 4744 scope.go:117] "RemoveContainer" containerID="3ec60ef389fcbbe40a8f1440d201481ad13716fb445a5bdc46859704a772186e" Sep 30 03:22:39 crc kubenswrapper[4744]: I0930 03:22:39.474017 4744 scope.go:117] "RemoveContainer" containerID="3fdbce881c392195b16267d99262d0baf566d2edaeeff38e5061e191b28d3606" Sep 30 03:22:40 crc kubenswrapper[4744]: I0930 03:22:40.103782 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rzbkf"] Sep 30 03:22:40 crc kubenswrapper[4744]: I0930 03:22:40.116119 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rzbkf"] Sep 30 03:22:41 crc kubenswrapper[4744]: I0930 03:22:41.504105 4744 scope.go:117] "RemoveContainer" 
containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:22:41 crc kubenswrapper[4744]: E0930 03:22:41.505815 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:22:41 crc kubenswrapper[4744]: I0930 03:22:41.523731 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab1acfa-0313-4621-9d6e-6ab34807d0e5" path="/var/lib/kubelet/pods/dab1acfa-0313-4621-9d6e-6ab34807d0e5/volumes" Sep 30 03:22:52 crc kubenswrapper[4744]: I0930 03:22:52.503997 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:22:52 crc kubenswrapper[4744]: E0930 03:22:52.505086 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:23:03 crc kubenswrapper[4744]: I0930 03:23:03.517222 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:23:03 crc kubenswrapper[4744]: E0930 03:23:03.518826 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:23:06 crc kubenswrapper[4744]: I0930 03:23:06.053599 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-5rqpd"] Sep 30 03:23:06 crc kubenswrapper[4744]: I0930 03:23:06.063313 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-5rqpd"] Sep 30 03:23:07 crc kubenswrapper[4744]: I0930 03:23:07.523407 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b10ecb07-4d75-4842-a753-f76c3a1d3b62" path="/var/lib/kubelet/pods/b10ecb07-4d75-4842-a753-f76c3a1d3b62/volumes" Sep 30 03:23:11 crc kubenswrapper[4744]: I0930 03:23:11.041476 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dpxjq"] Sep 30 03:23:11 crc kubenswrapper[4744]: I0930 03:23:11.055310 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dpxjq"] Sep 30 03:23:11 crc kubenswrapper[4744]: I0930 03:23:11.526428 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d629c05-1300-4fb5-8f08-211a133fffe8" path="/var/lib/kubelet/pods/6d629c05-1300-4fb5-8f08-211a133fffe8/volumes" Sep 30 03:23:14 crc kubenswrapper[4744]: I0930 03:23:14.503154 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:23:14 crc kubenswrapper[4744]: E0930 03:23:14.503700 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:23:15 crc kubenswrapper[4744]: I0930 03:23:15.044825 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nm8l8"] Sep 30 03:23:15 crc kubenswrapper[4744]: I0930 03:23:15.065343 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nm8l8"] Sep 30 03:23:15 crc kubenswrapper[4744]: I0930 03:23:15.524905 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08504742-967f-491a-a3ab-9ddcadb556c4" path="/var/lib/kubelet/pods/08504742-967f-491a-a3ab-9ddcadb556c4/volumes" Sep 30 03:23:25 crc kubenswrapper[4744]: I0930 03:23:25.504758 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:23:25 crc kubenswrapper[4744]: E0930 03:23:25.505871 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:23:27 crc kubenswrapper[4744]: I0930 03:23:27.040530 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bvn26"] Sep 30 03:23:27 crc kubenswrapper[4744]: I0930 03:23:27.058051 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bvn26"] Sep 30 03:23:27 crc kubenswrapper[4744]: I0930 03:23:27.069186 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-75phb"] Sep 30 03:23:27 crc kubenswrapper[4744]: I0930 03:23:27.078684 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-75phb"] Sep 30 03:23:27 crc kubenswrapper[4744]: I0930 03:23:27.518790 4744 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b19763-eb29-45ca-9431-8791543dee83" path="/var/lib/kubelet/pods/72b19763-eb29-45ca-9431-8791543dee83/volumes" Sep 30 03:23:27 crc kubenswrapper[4744]: I0930 03:23:27.520594 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b118b5fa-982e-4bd6-a6dc-5d2015b3b399" path="/var/lib/kubelet/pods/b118b5fa-982e-4bd6-a6dc-5d2015b3b399/volumes" Sep 30 03:23:33 crc kubenswrapper[4744]: I0930 03:23:33.050308 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-pwqjw"] Sep 30 03:23:33 crc kubenswrapper[4744]: I0930 03:23:33.064464 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-pwqjw"] Sep 30 03:23:33 crc kubenswrapper[4744]: I0930 03:23:33.523631 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24c42a2-4afa-4c32-ba87-18251fd1345a" path="/var/lib/kubelet/pods/a24c42a2-4afa-4c32-ba87-18251fd1345a/volumes" Sep 30 03:23:38 crc kubenswrapper[4744]: I0930 03:23:38.505676 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:23:38 crc kubenswrapper[4744]: E0930 03:23:38.506793 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:23:39 crc kubenswrapper[4744]: I0930 03:23:39.854122 4744 scope.go:117] "RemoveContainer" containerID="872fde3c707408ad072c82898450c2626de8273f8aab39941386a7ca03ae0278" Sep 30 03:23:39 crc kubenswrapper[4744]: I0930 03:23:39.905577 4744 scope.go:117] "RemoveContainer" 
containerID="71e2cf222297c03ad431b910faa1034a2782bd3476c1ee1489728f137ab7ff18" Sep 30 03:23:39 crc kubenswrapper[4744]: I0930 03:23:39.971159 4744 scope.go:117] "RemoveContainer" containerID="dffce1e13b41c3dbfe145e00caa36d90648e6f0d4f561b3a6a262e54cb9bb903" Sep 30 03:23:40 crc kubenswrapper[4744]: I0930 03:23:40.034883 4744 scope.go:117] "RemoveContainer" containerID="3b3026f2af2b2955aa87eab2eb9210451af07b8d9b87fdbe9ffc790294c4aedf" Sep 30 03:23:40 crc kubenswrapper[4744]: I0930 03:23:40.079746 4744 scope.go:117] "RemoveContainer" containerID="736e67b520621626ade8ae88a72df0e7390e168b4daadd51e756193ff9493e52" Sep 30 03:23:40 crc kubenswrapper[4744]: I0930 03:23:40.130640 4744 scope.go:117] "RemoveContainer" containerID="57edffd5dfc895f964c798f7ed818242474b6c5b30da74d79cc484ce03eb0c81" Sep 30 03:23:40 crc kubenswrapper[4744]: I0930 03:23:40.183659 4744 scope.go:117] "RemoveContainer" containerID="869acb7ec54dcc781e8364e30d14cdf03deb15f640b00501fbab0d05595a4f44" Sep 30 03:23:51 crc kubenswrapper[4744]: I0930 03:23:51.503784 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:23:51 crc kubenswrapper[4744]: E0930 03:23:51.504547 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.429166 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2v76c"] Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.434118 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.468071 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v76c"] Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.518276 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-utilities\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.518327 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-catalog-content\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.518413 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwr2\" (UniqueName: \"kubernetes.io/projected/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-kube-api-access-kxwr2\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.620938 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-utilities\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.620991 4744 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-catalog-content\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.621070 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwr2\" (UniqueName: \"kubernetes.io/projected/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-kube-api-access-kxwr2\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.621496 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-utilities\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.621520 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-catalog-content\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.657523 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwr2\" (UniqueName: \"kubernetes.io/projected/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-kube-api-access-kxwr2\") pod \"redhat-marketplace-2v76c\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:57 crc kubenswrapper[4744]: I0930 03:23:57.770484 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:23:58 crc kubenswrapper[4744]: I0930 03:23:58.286687 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v76c"] Sep 30 03:23:58 crc kubenswrapper[4744]: W0930 03:23:58.291142 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f4dd14_c224_4bdc_b9e8_3b749cc9d9fc.slice/crio-f50d6a08fe573bb2a060bc05251ee32edb172bde6c5e7250d00070f61516509d WatchSource:0}: Error finding container f50d6a08fe573bb2a060bc05251ee32edb172bde6c5e7250d00070f61516509d: Status 404 returned error can't find the container with id f50d6a08fe573bb2a060bc05251ee32edb172bde6c5e7250d00070f61516509d Sep 30 03:23:58 crc kubenswrapper[4744]: I0930 03:23:58.983929 4744 generic.go:334] "Generic (PLEG): container finished" podID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerID="a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894" exitCode=0 Sep 30 03:23:58 crc kubenswrapper[4744]: I0930 03:23:58.984002 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerDied","Data":"a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894"} Sep 30 03:23:58 crc kubenswrapper[4744]: I0930 03:23:58.984449 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerStarted","Data":"f50d6a08fe573bb2a060bc05251ee32edb172bde6c5e7250d00070f61516509d"} Sep 30 03:24:00 crc kubenswrapper[4744]: I0930 03:24:00.013109 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" 
event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerStarted","Data":"6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7"} Sep 30 03:24:01 crc kubenswrapper[4744]: I0930 03:24:01.034859 4744 generic.go:334] "Generic (PLEG): container finished" podID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerID="6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7" exitCode=0 Sep 30 03:24:01 crc kubenswrapper[4744]: I0930 03:24:01.034971 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerDied","Data":"6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7"} Sep 30 03:24:02 crc kubenswrapper[4744]: I0930 03:24:02.054048 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerStarted","Data":"ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51"} Sep 30 03:24:02 crc kubenswrapper[4744]: I0930 03:24:02.081879 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2v76c" podStartSLOduration=2.634685561 podStartE2EDuration="5.081849407s" podCreationTimestamp="2025-09-30 03:23:57 +0000 UTC" firstStartedPulling="2025-09-30 03:23:58.988132533 +0000 UTC m=+1766.161352547" lastFinishedPulling="2025-09-30 03:24:01.435296368 +0000 UTC m=+1768.608516393" observedRunningTime="2025-09-30 03:24:02.077710269 +0000 UTC m=+1769.250930283" watchObservedRunningTime="2025-09-30 03:24:02.081849407 +0000 UTC m=+1769.255069431" Sep 30 03:24:06 crc kubenswrapper[4744]: I0930 03:24:06.504541 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:24:06 crc kubenswrapper[4744]: E0930 03:24:06.505920 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:24:07 crc kubenswrapper[4744]: I0930 03:24:07.770653 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:24:07 crc kubenswrapper[4744]: I0930 03:24:07.771093 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:24:07 crc kubenswrapper[4744]: I0930 03:24:07.855096 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:24:08 crc kubenswrapper[4744]: I0930 03:24:08.131949 4744 generic.go:334] "Generic (PLEG): container finished" podID="bb8619b8-471f-4b9c-a9ee-97f668713bec" containerID="eb95c5a7c81091a55d90d2cb807800a7e16d4c5bce9684a21b5c3af5d698e866" exitCode=0 Sep 30 03:24:08 crc kubenswrapper[4744]: I0930 03:24:08.132046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" event={"ID":"bb8619b8-471f-4b9c-a9ee-97f668713bec","Type":"ContainerDied","Data":"eb95c5a7c81091a55d90d2cb807800a7e16d4c5bce9684a21b5c3af5d698e866"} Sep 30 03:24:08 crc kubenswrapper[4744]: I0930 03:24:08.227673 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:24:08 crc kubenswrapper[4744]: I0930 03:24:08.300992 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v76c"] Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.657815 4744 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.722209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-inventory\") pod \"bb8619b8-471f-4b9c-a9ee-97f668713bec\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.722314 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb79k\" (UniqueName: \"kubernetes.io/projected/bb8619b8-471f-4b9c-a9ee-97f668713bec-kube-api-access-cb79k\") pod \"bb8619b8-471f-4b9c-a9ee-97f668713bec\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.722628 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-ssh-key\") pod \"bb8619b8-471f-4b9c-a9ee-97f668713bec\" (UID: \"bb8619b8-471f-4b9c-a9ee-97f668713bec\") " Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.733874 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8619b8-471f-4b9c-a9ee-97f668713bec-kube-api-access-cb79k" (OuterVolumeSpecName: "kube-api-access-cb79k") pod "bb8619b8-471f-4b9c-a9ee-97f668713bec" (UID: "bb8619b8-471f-4b9c-a9ee-97f668713bec"). InnerVolumeSpecName "kube-api-access-cb79k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.773215 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-inventory" (OuterVolumeSpecName: "inventory") pod "bb8619b8-471f-4b9c-a9ee-97f668713bec" (UID: "bb8619b8-471f-4b9c-a9ee-97f668713bec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.773268 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb8619b8-471f-4b9c-a9ee-97f668713bec" (UID: "bb8619b8-471f-4b9c-a9ee-97f668713bec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.826196 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.826236 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb8619b8-471f-4b9c-a9ee-97f668713bec-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:09 crc kubenswrapper[4744]: I0930 03:24:09.826251 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb79k\" (UniqueName: \"kubernetes.io/projected/bb8619b8-471f-4b9c-a9ee-97f668713bec-kube-api-access-cb79k\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.166281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" event={"ID":"bb8619b8-471f-4b9c-a9ee-97f668713bec","Type":"ContainerDied","Data":"fae86aabaa621aeff741d154edd64374ed43dd58f2d098e0b8d68399fdb16e1e"} Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.166342 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fae86aabaa621aeff741d154edd64374ed43dd58f2d098e0b8d68399fdb16e1e" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.166351 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.166632 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2v76c" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="registry-server" containerID="cri-o://ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51" gracePeriod=2 Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.293153 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb"] Sep 30 03:24:10 crc kubenswrapper[4744]: E0930 03:24:10.293674 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8619b8-471f-4b9c-a9ee-97f668713bec" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.293695 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8619b8-471f-4b9c-a9ee-97f668713bec" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.293959 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8619b8-471f-4b9c-a9ee-97f668713bec" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.294805 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.301615 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb"] Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.331893 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.332290 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.332461 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.332578 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.443389 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndwc\" (UniqueName: \"kubernetes.io/projected/287b3de6-0593-428e-80d8-b70b360a7d41-kube-api-access-fndwc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.443562 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 
03:24:10.443652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.545308 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fndwc\" (UniqueName: \"kubernetes.io/projected/287b3de6-0593-428e-80d8-b70b360a7d41-kube-api-access-fndwc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.545391 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.545432 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.553095 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.557093 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.562826 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndwc\" (UniqueName: \"kubernetes.io/projected/287b3de6-0593-428e-80d8-b70b360a7d41-kube-api-access-fndwc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.653240 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.673481 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.747887 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxwr2\" (UniqueName: \"kubernetes.io/projected/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-kube-api-access-kxwr2\") pod \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.747988 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-utilities\") pod \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.748015 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-catalog-content\") pod \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\" (UID: \"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc\") " Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.750526 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-utilities" (OuterVolumeSpecName: "utilities") pod "70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" (UID: "70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.761520 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" (UID: "70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.777712 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-kube-api-access-kxwr2" (OuterVolumeSpecName: "kube-api-access-kxwr2") pod "70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" (UID: "70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc"). InnerVolumeSpecName "kube-api-access-kxwr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.856613 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxwr2\" (UniqueName: \"kubernetes.io/projected/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-kube-api-access-kxwr2\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.856672 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:10 crc kubenswrapper[4744]: I0930 03:24:10.856685 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.182494 4744 generic.go:334] "Generic (PLEG): container finished" podID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerID="ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51" exitCode=0 Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.182561 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerDied","Data":"ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51"} Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.182600 4744 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v76c" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.182637 4744 scope.go:117] "RemoveContainer" containerID="ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.182622 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v76c" event={"ID":"70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc","Type":"ContainerDied","Data":"f50d6a08fe573bb2a060bc05251ee32edb172bde6c5e7250d00070f61516509d"} Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.212567 4744 scope.go:117] "RemoveContainer" containerID="6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.236181 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v76c"] Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.246966 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v76c"] Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.264070 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb"] Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.269266 4744 scope.go:117] "RemoveContainer" containerID="a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.293087 4744 scope.go:117] "RemoveContainer" containerID="ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51" Sep 30 03:24:11 crc kubenswrapper[4744]: E0930 03:24:11.293609 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51\": container with ID starting with 
ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51 not found: ID does not exist" containerID="ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.293664 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51"} err="failed to get container status \"ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51\": rpc error: code = NotFound desc = could not find container \"ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51\": container with ID starting with ec114b31af5833818943cedf989823920375099805572937beca23afc4a59d51 not found: ID does not exist" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.293694 4744 scope.go:117] "RemoveContainer" containerID="6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7" Sep 30 03:24:11 crc kubenswrapper[4744]: E0930 03:24:11.294163 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7\": container with ID starting with 6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7 not found: ID does not exist" containerID="6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.294199 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7"} err="failed to get container status \"6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7\": rpc error: code = NotFound desc = could not find container \"6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7\": container with ID starting with 6dd0ffe2be0d0d10339109e64d686ea3055f7339f4ef6de4a7c968d3fa6cdad7 not found: ID does not 
exist" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.294220 4744 scope.go:117] "RemoveContainer" containerID="a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894" Sep 30 03:24:11 crc kubenswrapper[4744]: E0930 03:24:11.294604 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894\": container with ID starting with a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894 not found: ID does not exist" containerID="a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.294656 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894"} err="failed to get container status \"a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894\": rpc error: code = NotFound desc = could not find container \"a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894\": container with ID starting with a4d0652a50656f2178cae58f0b4166541c8024f915d4928f34f59e6500b01894 not found: ID does not exist" Sep 30 03:24:11 crc kubenswrapper[4744]: I0930 03:24:11.524341 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" path="/var/lib/kubelet/pods/70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc/volumes" Sep 30 03:24:12 crc kubenswrapper[4744]: I0930 03:24:12.207011 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" event={"ID":"287b3de6-0593-428e-80d8-b70b360a7d41","Type":"ContainerStarted","Data":"effb0aef6841ee1535044febb2cd32b2c0468ff3a092c8942c642828b93ab0cd"} Sep 30 03:24:12 crc kubenswrapper[4744]: I0930 03:24:12.207363 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" event={"ID":"287b3de6-0593-428e-80d8-b70b360a7d41","Type":"ContainerStarted","Data":"c4cc43ddf43983079fb9e545611af172e51e18cf67bb095d2b434559f2777662"} Sep 30 03:24:12 crc kubenswrapper[4744]: I0930 03:24:12.248401 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" podStartSLOduration=1.7845807439999999 podStartE2EDuration="2.248364049s" podCreationTimestamp="2025-09-30 03:24:10 +0000 UTC" firstStartedPulling="2025-09-30 03:24:11.293503932 +0000 UTC m=+1778.466723906" lastFinishedPulling="2025-09-30 03:24:11.757287227 +0000 UTC m=+1778.930507211" observedRunningTime="2025-09-30 03:24:12.234408296 +0000 UTC m=+1779.407628270" watchObservedRunningTime="2025-09-30 03:24:12.248364049 +0000 UTC m=+1779.421584033" Sep 30 03:24:14 crc kubenswrapper[4744]: I0930 03:24:14.053562 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7cwff"] Sep 30 03:24:14 crc kubenswrapper[4744]: I0930 03:24:14.071926 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-swpzg"] Sep 30 03:24:14 crc kubenswrapper[4744]: I0930 03:24:14.081264 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-brwgn"] Sep 30 03:24:14 crc kubenswrapper[4744]: I0930 03:24:14.088119 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7cwff"] Sep 30 03:24:14 crc kubenswrapper[4744]: I0930 03:24:14.094416 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-brwgn"] Sep 30 03:24:14 crc kubenswrapper[4744]: I0930 03:24:14.101002 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-swpzg"] Sep 30 03:24:15 crc kubenswrapper[4744]: I0930 03:24:15.525320 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="292dafaf-ce57-46a1-8430-7d4f7baf831c" path="/var/lib/kubelet/pods/292dafaf-ce57-46a1-8430-7d4f7baf831c/volumes" Sep 30 03:24:15 crc kubenswrapper[4744]: I0930 03:24:15.526869 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435ec201-6b70-44c7-bafa-9e203803ed2b" path="/var/lib/kubelet/pods/435ec201-6b70-44c7-bafa-9e203803ed2b/volumes" Sep 30 03:24:15 crc kubenswrapper[4744]: I0930 03:24:15.528132 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f73e88-9ffe-4d46-9869-9e3b22e2054e" path="/var/lib/kubelet/pods/88f73e88-9ffe-4d46-9869-9e3b22e2054e/volumes" Sep 30 03:24:17 crc kubenswrapper[4744]: I0930 03:24:17.299016 4744 generic.go:334] "Generic (PLEG): container finished" podID="287b3de6-0593-428e-80d8-b70b360a7d41" containerID="effb0aef6841ee1535044febb2cd32b2c0468ff3a092c8942c642828b93ab0cd" exitCode=0 Sep 30 03:24:17 crc kubenswrapper[4744]: I0930 03:24:17.299583 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" event={"ID":"287b3de6-0593-428e-80d8-b70b360a7d41","Type":"ContainerDied","Data":"effb0aef6841ee1535044febb2cd32b2c0468ff3a092c8942c642828b93ab0cd"} Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.832761 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.945997 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-ssh-key\") pod \"287b3de6-0593-428e-80d8-b70b360a7d41\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.946041 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-inventory\") pod \"287b3de6-0593-428e-80d8-b70b360a7d41\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.946088 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fndwc\" (UniqueName: \"kubernetes.io/projected/287b3de6-0593-428e-80d8-b70b360a7d41-kube-api-access-fndwc\") pod \"287b3de6-0593-428e-80d8-b70b360a7d41\" (UID: \"287b3de6-0593-428e-80d8-b70b360a7d41\") " Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.953327 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287b3de6-0593-428e-80d8-b70b360a7d41-kube-api-access-fndwc" (OuterVolumeSpecName: "kube-api-access-fndwc") pod "287b3de6-0593-428e-80d8-b70b360a7d41" (UID: "287b3de6-0593-428e-80d8-b70b360a7d41"). InnerVolumeSpecName "kube-api-access-fndwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.981570 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-inventory" (OuterVolumeSpecName: "inventory") pod "287b3de6-0593-428e-80d8-b70b360a7d41" (UID: "287b3de6-0593-428e-80d8-b70b360a7d41"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:24:18 crc kubenswrapper[4744]: I0930 03:24:18.984347 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "287b3de6-0593-428e-80d8-b70b360a7d41" (UID: "287b3de6-0593-428e-80d8-b70b360a7d41"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.047888 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fndwc\" (UniqueName: \"kubernetes.io/projected/287b3de6-0593-428e-80d8-b70b360a7d41-kube-api-access-fndwc\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.047917 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.047929 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287b3de6-0593-428e-80d8-b70b360a7d41-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.321655 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" event={"ID":"287b3de6-0593-428e-80d8-b70b360a7d41","Type":"ContainerDied","Data":"c4cc43ddf43983079fb9e545611af172e51e18cf67bb095d2b434559f2777662"} Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.321723 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4cc43ddf43983079fb9e545611af172e51e18cf67bb095d2b434559f2777662" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.321801 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.413967 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6"] Sep 30 03:24:19 crc kubenswrapper[4744]: E0930 03:24:19.414420 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287b3de6-0593-428e-80d8-b70b360a7d41" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.414707 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="287b3de6-0593-428e-80d8-b70b360a7d41" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 03:24:19 crc kubenswrapper[4744]: E0930 03:24:19.414720 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="extract-utilities" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.414729 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="extract-utilities" Sep 30 03:24:19 crc kubenswrapper[4744]: E0930 03:24:19.414751 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="extract-content" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.414760 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="extract-content" Sep 30 03:24:19 crc kubenswrapper[4744]: E0930 03:24:19.414775 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="registry-server" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.414783 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="registry-server" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 
03:24:19.415032 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f4dd14-c224-4bdc-b9e8-3b749cc9d9fc" containerName="registry-server" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.415044 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="287b3de6-0593-428e-80d8-b70b360a7d41" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.415806 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.419056 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.424634 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.424924 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.426883 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.449196 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6"] Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.503951 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:24:19 crc kubenswrapper[4744]: E0930 03:24:19.504258 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.565803 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.565932 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.566044 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5wcq\" (UniqueName: \"kubernetes.io/projected/8a000c1d-f61a-4bb0-8041-acf07914d4de-kube-api-access-q5wcq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.668025 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5wcq\" (UniqueName: \"kubernetes.io/projected/8a000c1d-f61a-4bb0-8041-acf07914d4de-kube-api-access-q5wcq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.668283 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.669641 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.672162 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.672508 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.687208 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5wcq\" (UniqueName: 
\"kubernetes.io/projected/8a000c1d-f61a-4bb0-8041-acf07914d4de-kube-api-access-q5wcq\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-frfr6\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:19 crc kubenswrapper[4744]: I0930 03:24:19.740263 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:24:20 crc kubenswrapper[4744]: I0930 03:24:20.307417 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6"] Sep 30 03:24:20 crc kubenswrapper[4744]: W0930 03:24:20.313743 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a000c1d_f61a_4bb0_8041_acf07914d4de.slice/crio-3cf711f33b8a7684899e60df325545c0614b5d93dc0eca23b1a26b267ef6cecf WatchSource:0}: Error finding container 3cf711f33b8a7684899e60df325545c0614b5d93dc0eca23b1a26b267ef6cecf: Status 404 returned error can't find the container with id 3cf711f33b8a7684899e60df325545c0614b5d93dc0eca23b1a26b267ef6cecf Sep 30 03:24:20 crc kubenswrapper[4744]: I0930 03:24:20.336359 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" event={"ID":"8a000c1d-f61a-4bb0-8041-acf07914d4de","Type":"ContainerStarted","Data":"3cf711f33b8a7684899e60df325545c0614b5d93dc0eca23b1a26b267ef6cecf"} Sep 30 03:24:21 crc kubenswrapper[4744]: I0930 03:24:21.352127 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" event={"ID":"8a000c1d-f61a-4bb0-8041-acf07914d4de","Type":"ContainerStarted","Data":"68c630071e0794fba5310b24d9cade6bfb55d6092dfefadae5770d788206d614"} Sep 30 03:24:21 crc kubenswrapper[4744]: I0930 03:24:21.386577 4744 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" podStartSLOduration=1.913675668 podStartE2EDuration="2.386546784s" podCreationTimestamp="2025-09-30 03:24:19 +0000 UTC" firstStartedPulling="2025-09-30 03:24:20.316794441 +0000 UTC m=+1787.490014425" lastFinishedPulling="2025-09-30 03:24:20.789665537 +0000 UTC m=+1787.962885541" observedRunningTime="2025-09-30 03:24:21.372913111 +0000 UTC m=+1788.546133125" watchObservedRunningTime="2025-09-30 03:24:21.386546784 +0000 UTC m=+1788.559766788" Sep 30 03:24:23 crc kubenswrapper[4744]: I0930 03:24:23.036015 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-22c7-account-create-qnv8l"] Sep 30 03:24:23 crc kubenswrapper[4744]: I0930 03:24:23.051941 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-22c7-account-create-qnv8l"] Sep 30 03:24:23 crc kubenswrapper[4744]: I0930 03:24:23.524923 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8" path="/var/lib/kubelet/pods/21dd3df4-6b6f-48ea-80fa-8a9d9c6785e8/volumes" Sep 30 03:24:24 crc kubenswrapper[4744]: I0930 03:24:24.030619 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e6bc-account-create-n922k"] Sep 30 03:24:24 crc kubenswrapper[4744]: I0930 03:24:24.048017 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e6bc-account-create-n922k"] Sep 30 03:24:24 crc kubenswrapper[4744]: I0930 03:24:24.056531 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-428f-account-create-wnn55"] Sep 30 03:24:24 crc kubenswrapper[4744]: I0930 03:24:24.065416 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-428f-account-create-wnn55"] Sep 30 03:24:25 crc kubenswrapper[4744]: I0930 03:24:25.525780 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3465d7e4-f246-47a0-a809-d690670848f5" 
path="/var/lib/kubelet/pods/3465d7e4-f246-47a0-a809-d690670848f5/volumes" Sep 30 03:24:25 crc kubenswrapper[4744]: I0930 03:24:25.527801 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3272d22-2919-4fb7-98ea-9193216bcbd3" path="/var/lib/kubelet/pods/d3272d22-2919-4fb7-98ea-9193216bcbd3/volumes" Sep 30 03:24:30 crc kubenswrapper[4744]: I0930 03:24:30.504934 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:24:30 crc kubenswrapper[4744]: E0930 03:24:30.506145 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:24:40 crc kubenswrapper[4744]: I0930 03:24:40.380434 4744 scope.go:117] "RemoveContainer" containerID="356de144e3b104e081ed63954db6371fab98cc4d2223df6981a4a9a4f20b3176" Sep 30 03:24:40 crc kubenswrapper[4744]: I0930 03:24:40.419777 4744 scope.go:117] "RemoveContainer" containerID="0ed099b0bb12b047fec70953124f0245395bac3e510e85b762650789a71d8d71" Sep 30 03:24:40 crc kubenswrapper[4744]: I0930 03:24:40.475144 4744 scope.go:117] "RemoveContainer" containerID="be7cae86c5eccb5a76e18067e5661cfc3b620a095dac057bb3b488c517e2144f" Sep 30 03:24:40 crc kubenswrapper[4744]: I0930 03:24:40.513850 4744 scope.go:117] "RemoveContainer" containerID="137c7c9fe46767d0b0d131f578e64c5697d5242268518b209b4b739a02626e73" Sep 30 03:24:40 crc kubenswrapper[4744]: I0930 03:24:40.595831 4744 scope.go:117] "RemoveContainer" containerID="bca3bced2322eef2371d5365d9db01ab72bd9ef9ea1799eda234c24c262ddc2f" Sep 30 03:24:40 crc kubenswrapper[4744]: I0930 03:24:40.625249 4744 scope.go:117] "RemoveContainer" 
containerID="eb8870a882743cbec0d66513f609ccacc5d94b1a910a483be2779f2102f8c675" Sep 30 03:24:41 crc kubenswrapper[4744]: I0930 03:24:41.504609 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:24:42 crc kubenswrapper[4744]: I0930 03:24:42.615932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"fe2e0c31b2f11e084705476ec0ebe78e94be3c2f8bdcb24273e81b7e0e5969e9"} Sep 30 03:24:49 crc kubenswrapper[4744]: I0930 03:24:49.058783 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77vc8"] Sep 30 03:24:49 crc kubenswrapper[4744]: I0930 03:24:49.067790 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-77vc8"] Sep 30 03:24:49 crc kubenswrapper[4744]: I0930 03:24:49.525169 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e0e876-4407-4f01-8f41-31a30f8dbb93" path="/var/lib/kubelet/pods/23e0e876-4407-4f01-8f41-31a30f8dbb93/volumes" Sep 30 03:25:00 crc kubenswrapper[4744]: I0930 03:25:00.821776 4744 generic.go:334] "Generic (PLEG): container finished" podID="8a000c1d-f61a-4bb0-8041-acf07914d4de" containerID="68c630071e0794fba5310b24d9cade6bfb55d6092dfefadae5770d788206d614" exitCode=0 Sep 30 03:25:00 crc kubenswrapper[4744]: I0930 03:25:00.821883 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" event={"ID":"8a000c1d-f61a-4bb0-8041-acf07914d4de","Type":"ContainerDied","Data":"68c630071e0794fba5310b24d9cade6bfb55d6092dfefadae5770d788206d614"} Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.307722 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.396768 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5wcq\" (UniqueName: \"kubernetes.io/projected/8a000c1d-f61a-4bb0-8041-acf07914d4de-kube-api-access-q5wcq\") pod \"8a000c1d-f61a-4bb0-8041-acf07914d4de\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.396826 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-inventory\") pod \"8a000c1d-f61a-4bb0-8041-acf07914d4de\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.397049 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-ssh-key\") pod \"8a000c1d-f61a-4bb0-8041-acf07914d4de\" (UID: \"8a000c1d-f61a-4bb0-8041-acf07914d4de\") " Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.404264 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a000c1d-f61a-4bb0-8041-acf07914d4de-kube-api-access-q5wcq" (OuterVolumeSpecName: "kube-api-access-q5wcq") pod "8a000c1d-f61a-4bb0-8041-acf07914d4de" (UID: "8a000c1d-f61a-4bb0-8041-acf07914d4de"). InnerVolumeSpecName "kube-api-access-q5wcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.424892 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a000c1d-f61a-4bb0-8041-acf07914d4de" (UID: "8a000c1d-f61a-4bb0-8041-acf07914d4de"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.445043 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-inventory" (OuterVolumeSpecName: "inventory") pod "8a000c1d-f61a-4bb0-8041-acf07914d4de" (UID: "8a000c1d-f61a-4bb0-8041-acf07914d4de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.498827 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5wcq\" (UniqueName: \"kubernetes.io/projected/8a000c1d-f61a-4bb0-8041-acf07914d4de-kube-api-access-q5wcq\") on node \"crc\" DevicePath \"\"" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.498860 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.498870 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a000c1d-f61a-4bb0-8041-acf07914d4de-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.851500 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" event={"ID":"8a000c1d-f61a-4bb0-8041-acf07914d4de","Type":"ContainerDied","Data":"3cf711f33b8a7684899e60df325545c0614b5d93dc0eca23b1a26b267ef6cecf"} Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.851567 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cf711f33b8a7684899e60df325545c0614b5d93dc0eca23b1a26b267ef6cecf" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.851587 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-frfr6" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.965649 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d"] Sep 30 03:25:02 crc kubenswrapper[4744]: E0930 03:25:02.966585 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a000c1d-f61a-4bb0-8041-acf07914d4de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.966625 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a000c1d-f61a-4bb0-8041-acf07914d4de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.967083 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a000c1d-f61a-4bb0-8041-acf07914d4de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.968905 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.971485 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.974200 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d"] Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.974977 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.975134 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:25:02 crc kubenswrapper[4744]: I0930 03:25:02.975244 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.012895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.012989 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.013077 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwqd\" (UniqueName: \"kubernetes.io/projected/3c88be7c-d782-4d4c-9110-997c89d8261e-kube-api-access-wkwqd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.114954 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.115103 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.115231 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkwqd\" (UniqueName: \"kubernetes.io/projected/3c88be7c-d782-4d4c-9110-997c89d8261e-kube-api-access-wkwqd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.123824 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: 
\"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.124321 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.145418 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwqd\" (UniqueName: \"kubernetes.io/projected/3c88be7c-d782-4d4c-9110-997c89d8261e-kube-api-access-wkwqd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp49d\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.307224 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.738149 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d"] Sep 30 03:25:03 crc kubenswrapper[4744]: I0930 03:25:03.864544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" event={"ID":"3c88be7c-d782-4d4c-9110-997c89d8261e","Type":"ContainerStarted","Data":"6683b9c99d8db51a807aad15f4e531614dd4c89481ad56701fc54d9c1b43af6a"} Sep 30 03:25:04 crc kubenswrapper[4744]: I0930 03:25:04.875907 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" event={"ID":"3c88be7c-d782-4d4c-9110-997c89d8261e","Type":"ContainerStarted","Data":"bd902da7665b6ee8654a7b542a5e8cdb974bbf91c279c3b2bc0e12ff7c3c563c"} Sep 30 03:25:04 crc kubenswrapper[4744]: I0930 03:25:04.897352 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" podStartSLOduration=2.329041246 podStartE2EDuration="2.897330945s" podCreationTimestamp="2025-09-30 03:25:02 +0000 UTC" firstStartedPulling="2025-09-30 03:25:03.746001309 +0000 UTC m=+1830.919221293" lastFinishedPulling="2025-09-30 03:25:04.314291008 +0000 UTC m=+1831.487510992" observedRunningTime="2025-09-30 03:25:04.896854951 +0000 UTC m=+1832.070074925" watchObservedRunningTime="2025-09-30 03:25:04.897330945 +0000 UTC m=+1832.070550919" Sep 30 03:25:06 crc kubenswrapper[4744]: I0930 03:25:06.052520 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6jc7w"] Sep 30 03:25:06 crc kubenswrapper[4744]: I0930 03:25:06.063857 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpjh8"] Sep 30 03:25:06 crc kubenswrapper[4744]: 
I0930 03:25:06.076750 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6jc7w"] Sep 30 03:25:06 crc kubenswrapper[4744]: I0930 03:25:06.088231 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpjh8"] Sep 30 03:25:07 crc kubenswrapper[4744]: I0930 03:25:07.518299 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f23037-0380-4f1a-adc4-8ef59910c1f6" path="/var/lib/kubelet/pods/98f23037-0380-4f1a-adc4-8ef59910c1f6/volumes" Sep 30 03:25:07 crc kubenswrapper[4744]: I0930 03:25:07.519425 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfbe619-9f32-4938-ab7c-e32cdbbdf94e" path="/var/lib/kubelet/pods/cbfbe619-9f32-4938-ab7c-e32cdbbdf94e/volumes" Sep 30 03:25:40 crc kubenswrapper[4744]: I0930 03:25:40.844848 4744 scope.go:117] "RemoveContainer" containerID="70b57b054706f2fc74bd01ff06ef4f9fccfd73207ffdca04da3a240618336cb0" Sep 30 03:25:40 crc kubenswrapper[4744]: I0930 03:25:40.897971 4744 scope.go:117] "RemoveContainer" containerID="3250bea0036aed0d33edbfb5fc8a9a3369cf28bc3db4c5e4149f5b741d8fd39a" Sep 30 03:25:40 crc kubenswrapper[4744]: I0930 03:25:40.963287 4744 scope.go:117] "RemoveContainer" containerID="356fb22a7828dc96b7a56c602cb2a3a32fc45c74aad761c67b031451a0dde60f" Sep 30 03:25:50 crc kubenswrapper[4744]: I0930 03:25:50.054868 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pr7xb"] Sep 30 03:25:50 crc kubenswrapper[4744]: I0930 03:25:50.064750 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pr7xb"] Sep 30 03:25:51 crc kubenswrapper[4744]: I0930 03:25:51.513355 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b286f88e-e3f3-4730-b831-4db33fb09a99" path="/var/lib/kubelet/pods/b286f88e-e3f3-4730-b831-4db33fb09a99/volumes" Sep 30 03:26:00 crc kubenswrapper[4744]: I0930 03:26:00.499573 4744 generic.go:334] 
"Generic (PLEG): container finished" podID="3c88be7c-d782-4d4c-9110-997c89d8261e" containerID="bd902da7665b6ee8654a7b542a5e8cdb974bbf91c279c3b2bc0e12ff7c3c563c" exitCode=0 Sep 30 03:26:00 crc kubenswrapper[4744]: I0930 03:26:00.499668 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" event={"ID":"3c88be7c-d782-4d4c-9110-997c89d8261e","Type":"ContainerDied","Data":"bd902da7665b6ee8654a7b542a5e8cdb974bbf91c279c3b2bc0e12ff7c3c563c"} Sep 30 03:26:01 crc kubenswrapper[4744]: I0930 03:26:01.982201 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.067632 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-ssh-key\") pod \"3c88be7c-d782-4d4c-9110-997c89d8261e\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.067920 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-inventory\") pod \"3c88be7c-d782-4d4c-9110-997c89d8261e\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.068870 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkwqd\" (UniqueName: \"kubernetes.io/projected/3c88be7c-d782-4d4c-9110-997c89d8261e-kube-api-access-wkwqd\") pod \"3c88be7c-d782-4d4c-9110-997c89d8261e\" (UID: \"3c88be7c-d782-4d4c-9110-997c89d8261e\") " Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.076098 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c88be7c-d782-4d4c-9110-997c89d8261e-kube-api-access-wkwqd" 
(OuterVolumeSpecName: "kube-api-access-wkwqd") pod "3c88be7c-d782-4d4c-9110-997c89d8261e" (UID: "3c88be7c-d782-4d4c-9110-997c89d8261e"). InnerVolumeSpecName "kube-api-access-wkwqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.122842 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-inventory" (OuterVolumeSpecName: "inventory") pod "3c88be7c-d782-4d4c-9110-997c89d8261e" (UID: "3c88be7c-d782-4d4c-9110-997c89d8261e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.125934 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c88be7c-d782-4d4c-9110-997c89d8261e" (UID: "3c88be7c-d782-4d4c-9110-997c89d8261e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.180142 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkwqd\" (UniqueName: \"kubernetes.io/projected/3c88be7c-d782-4d4c-9110-997c89d8261e-kube-api-access-wkwqd\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.180207 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.180227 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c88be7c-d782-4d4c-9110-997c89d8261e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.523351 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" event={"ID":"3c88be7c-d782-4d4c-9110-997c89d8261e","Type":"ContainerDied","Data":"6683b9c99d8db51a807aad15f4e531614dd4c89481ad56701fc54d9c1b43af6a"} Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.523734 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6683b9c99d8db51a807aad15f4e531614dd4c89481ad56701fc54d9c1b43af6a" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.523496 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp49d" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.638044 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4lhn"] Sep 30 03:26:02 crc kubenswrapper[4744]: E0930 03:26:02.641164 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c88be7c-d782-4d4c-9110-997c89d8261e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.641192 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c88be7c-d782-4d4c-9110-997c89d8261e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.641471 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c88be7c-d782-4d4c-9110-997c89d8261e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.642299 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.645742 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.645775 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.645862 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.650244 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.651554 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4lhn"] Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.693968 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphxn\" (UniqueName: \"kubernetes.io/projected/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-kube-api-access-qphxn\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.694140 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.694196 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.795434 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.795504 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.795620 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qphxn\" (UniqueName: \"kubernetes.io/projected/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-kube-api-access-qphxn\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.800246 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: 
I0930 03:26:02.801241 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.813552 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphxn\" (UniqueName: \"kubernetes.io/projected/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-kube-api-access-qphxn\") pod \"ssh-known-hosts-edpm-deployment-p4lhn\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:02 crc kubenswrapper[4744]: I0930 03:26:02.969394 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:03 crc kubenswrapper[4744]: I0930 03:26:03.607579 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4lhn"] Sep 30 03:26:03 crc kubenswrapper[4744]: I0930 03:26:03.618692 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:26:04 crc kubenswrapper[4744]: I0930 03:26:04.558543 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" event={"ID":"3aa13f38-7b2d-4f65-8ca1-0de736d1f291","Type":"ContainerStarted","Data":"623be28c878f94c4da6a5c10688606899843033412a429568de3d53bdeca2fc6"} Sep 30 03:26:04 crc kubenswrapper[4744]: I0930 03:26:04.559607 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" event={"ID":"3aa13f38-7b2d-4f65-8ca1-0de736d1f291","Type":"ContainerStarted","Data":"2f9acc86715a01bd1f7c832e7db1d02276b83117895f23d7aad2f0fb9ec91530"} Sep 30 03:26:04 crc kubenswrapper[4744]: I0930 
03:26:04.597653 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" podStartSLOduration=2.149717354 podStartE2EDuration="2.597634841s" podCreationTimestamp="2025-09-30 03:26:02 +0000 UTC" firstStartedPulling="2025-09-30 03:26:03.618077388 +0000 UTC m=+1890.791297402" lastFinishedPulling="2025-09-30 03:26:04.065994905 +0000 UTC m=+1891.239214889" observedRunningTime="2025-09-30 03:26:04.582649617 +0000 UTC m=+1891.755869591" watchObservedRunningTime="2025-09-30 03:26:04.597634841 +0000 UTC m=+1891.770854805" Sep 30 03:26:12 crc kubenswrapper[4744]: I0930 03:26:12.654874 4744 generic.go:334] "Generic (PLEG): container finished" podID="3aa13f38-7b2d-4f65-8ca1-0de736d1f291" containerID="623be28c878f94c4da6a5c10688606899843033412a429568de3d53bdeca2fc6" exitCode=0 Sep 30 03:26:12 crc kubenswrapper[4744]: I0930 03:26:12.655465 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" event={"ID":"3aa13f38-7b2d-4f65-8ca1-0de736d1f291","Type":"ContainerDied","Data":"623be28c878f94c4da6a5c10688606899843033412a429568de3d53bdeca2fc6"} Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.141088 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.256919 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qphxn\" (UniqueName: \"kubernetes.io/projected/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-kube-api-access-qphxn\") pod \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.257079 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-ssh-key-openstack-edpm-ipam\") pod \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.257286 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-inventory-0\") pod \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\" (UID: \"3aa13f38-7b2d-4f65-8ca1-0de736d1f291\") " Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.261935 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-kube-api-access-qphxn" (OuterVolumeSpecName: "kube-api-access-qphxn") pod "3aa13f38-7b2d-4f65-8ca1-0de736d1f291" (UID: "3aa13f38-7b2d-4f65-8ca1-0de736d1f291"). InnerVolumeSpecName "kube-api-access-qphxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.283386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3aa13f38-7b2d-4f65-8ca1-0de736d1f291" (UID: "3aa13f38-7b2d-4f65-8ca1-0de736d1f291"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.294929 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3aa13f38-7b2d-4f65-8ca1-0de736d1f291" (UID: "3aa13f38-7b2d-4f65-8ca1-0de736d1f291"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.360030 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.360332 4744 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.360621 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qphxn\" (UniqueName: \"kubernetes.io/projected/3aa13f38-7b2d-4f65-8ca1-0de736d1f291-kube-api-access-qphxn\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.683709 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" event={"ID":"3aa13f38-7b2d-4f65-8ca1-0de736d1f291","Type":"ContainerDied","Data":"2f9acc86715a01bd1f7c832e7db1d02276b83117895f23d7aad2f0fb9ec91530"} Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.683755 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f9acc86715a01bd1f7c832e7db1d02276b83117895f23d7aad2f0fb9ec91530" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.683766 
4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4lhn" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.744917 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7"] Sep 30 03:26:14 crc kubenswrapper[4744]: E0930 03:26:14.745321 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa13f38-7b2d-4f65-8ca1-0de736d1f291" containerName="ssh-known-hosts-edpm-deployment" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.745337 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa13f38-7b2d-4f65-8ca1-0de736d1f291" containerName="ssh-known-hosts-edpm-deployment" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.745557 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa13f38-7b2d-4f65-8ca1-0de736d1f291" containerName="ssh-known-hosts-edpm-deployment" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.746229 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.749027 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.749441 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.749791 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.750105 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.753172 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7"] Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.768226 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7psq\" (UniqueName: \"kubernetes.io/projected/13c93dcf-8343-45ef-a4cf-3f411d5311e1-kube-api-access-v7psq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.768574 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.768647 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.870487 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7psq\" (UniqueName: \"kubernetes.io/projected/13c93dcf-8343-45ef-a4cf-3f411d5311e1-kube-api-access-v7psq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.870571 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.870650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.876717 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.878019 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:14 crc kubenswrapper[4744]: I0930 03:26:14.901789 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7psq\" (UniqueName: \"kubernetes.io/projected/13c93dcf-8343-45ef-a4cf-3f411d5311e1-kube-api-access-v7psq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-q66n7\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:15 crc kubenswrapper[4744]: I0930 03:26:15.064440 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:15 crc kubenswrapper[4744]: I0930 03:26:15.693255 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7"] Sep 30 03:26:16 crc kubenswrapper[4744]: I0930 03:26:16.701888 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" event={"ID":"13c93dcf-8343-45ef-a4cf-3f411d5311e1","Type":"ContainerStarted","Data":"96214def51eb4decff52e61d57c73aa0e6d7caa2ef4a6ef6f43ce720d28a80ce"} Sep 30 03:26:16 crc kubenswrapper[4744]: I0930 03:26:16.701927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" event={"ID":"13c93dcf-8343-45ef-a4cf-3f411d5311e1","Type":"ContainerStarted","Data":"0e74499e192a607c9a30c81bffdd67b8ead2a86a6d8cabf2350f461b69a84ac4"} Sep 30 03:26:16 crc kubenswrapper[4744]: I0930 03:26:16.724579 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" podStartSLOduration=2.282562161 podStartE2EDuration="2.724564785s" podCreationTimestamp="2025-09-30 03:26:14 +0000 UTC" firstStartedPulling="2025-09-30 03:26:15.696066241 +0000 UTC m=+1902.869286215" lastFinishedPulling="2025-09-30 03:26:16.138068865 +0000 UTC m=+1903.311288839" observedRunningTime="2025-09-30 03:26:16.722526362 +0000 UTC m=+1903.895746356" watchObservedRunningTime="2025-09-30 03:26:16.724564785 +0000 UTC m=+1903.897784759" Sep 30 03:26:26 crc kubenswrapper[4744]: I0930 03:26:26.818409 4744 generic.go:334] "Generic (PLEG): container finished" podID="13c93dcf-8343-45ef-a4cf-3f411d5311e1" containerID="96214def51eb4decff52e61d57c73aa0e6d7caa2ef4a6ef6f43ce720d28a80ce" exitCode=0 Sep 30 03:26:26 crc kubenswrapper[4744]: I0930 03:26:26.818537 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" event={"ID":"13c93dcf-8343-45ef-a4cf-3f411d5311e1","Type":"ContainerDied","Data":"96214def51eb4decff52e61d57c73aa0e6d7caa2ef4a6ef6f43ce720d28a80ce"} Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.354900 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.399983 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7psq\" (UniqueName: \"kubernetes.io/projected/13c93dcf-8343-45ef-a4cf-3f411d5311e1-kube-api-access-v7psq\") pod \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.400169 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-ssh-key\") pod \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.400289 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-inventory\") pod \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\" (UID: \"13c93dcf-8343-45ef-a4cf-3f411d5311e1\") " Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.407045 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c93dcf-8343-45ef-a4cf-3f411d5311e1-kube-api-access-v7psq" (OuterVolumeSpecName: "kube-api-access-v7psq") pod "13c93dcf-8343-45ef-a4cf-3f411d5311e1" (UID: "13c93dcf-8343-45ef-a4cf-3f411d5311e1"). InnerVolumeSpecName "kube-api-access-v7psq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.434752 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13c93dcf-8343-45ef-a4cf-3f411d5311e1" (UID: "13c93dcf-8343-45ef-a4cf-3f411d5311e1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.436880 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-inventory" (OuterVolumeSpecName: "inventory") pod "13c93dcf-8343-45ef-a4cf-3f411d5311e1" (UID: "13c93dcf-8343-45ef-a4cf-3f411d5311e1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.503186 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.503228 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13c93dcf-8343-45ef-a4cf-3f411d5311e1-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.503243 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7psq\" (UniqueName: \"kubernetes.io/projected/13c93dcf-8343-45ef-a4cf-3f411d5311e1-kube-api-access-v7psq\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.848344 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" 
event={"ID":"13c93dcf-8343-45ef-a4cf-3f411d5311e1","Type":"ContainerDied","Data":"0e74499e192a607c9a30c81bffdd67b8ead2a86a6d8cabf2350f461b69a84ac4"} Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.848416 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e74499e192a607c9a30c81bffdd67b8ead2a86a6d8cabf2350f461b69a84ac4" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.848490 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-q66n7" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.945676 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn"] Sep 30 03:26:28 crc kubenswrapper[4744]: E0930 03:26:28.946631 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c93dcf-8343-45ef-a4cf-3f411d5311e1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.946658 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c93dcf-8343-45ef-a4cf-3f411d5311e1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.946869 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c93dcf-8343-45ef-a4cf-3f411d5311e1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.947562 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.949645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.949822 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.950148 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.954696 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:26:28 crc kubenswrapper[4744]: I0930 03:26:28.958204 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn"] Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.015710 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.015779 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.015807 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr6km\" (UniqueName: \"kubernetes.io/projected/3e4f8446-ac54-4cff-b7f3-025ced28cc74-kube-api-access-dr6km\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.117713 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.117763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.117789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr6km\" (UniqueName: \"kubernetes.io/projected/3e4f8446-ac54-4cff-b7f3-025ced28cc74-kube-api-access-dr6km\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.122318 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: 
\"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.129236 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.141709 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr6km\" (UniqueName: \"kubernetes.io/projected/3e4f8446-ac54-4cff-b7f3-025ced28cc74-kube-api-access-dr6km\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.270497 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:29 crc kubenswrapper[4744]: I0930 03:26:29.946079 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn"] Sep 30 03:26:30 crc kubenswrapper[4744]: I0930 03:26:30.873054 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" event={"ID":"3e4f8446-ac54-4cff-b7f3-025ced28cc74","Type":"ContainerStarted","Data":"d6e1c4a50163a854101cccd02a4bee3537d688b081c764960f9d463c57b0382e"} Sep 30 03:26:30 crc kubenswrapper[4744]: I0930 03:26:30.873613 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" event={"ID":"3e4f8446-ac54-4cff-b7f3-025ced28cc74","Type":"ContainerStarted","Data":"eea97d61549f946bb935b1a89dc75dd602e6b34f5fc4cd504167584f26346dd2"} Sep 30 03:26:30 crc kubenswrapper[4744]: I0930 03:26:30.894192 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" podStartSLOduration=2.384842265 podStartE2EDuration="2.89417591s" podCreationTimestamp="2025-09-30 03:26:28 +0000 UTC" firstStartedPulling="2025-09-30 03:26:29.96562716 +0000 UTC m=+1917.138847134" lastFinishedPulling="2025-09-30 03:26:30.474960765 +0000 UTC m=+1917.648180779" observedRunningTime="2025-09-30 03:26:30.891291421 +0000 UTC m=+1918.064511425" watchObservedRunningTime="2025-09-30 03:26:30.89417591 +0000 UTC m=+1918.067395884" Sep 30 03:26:41 crc kubenswrapper[4744]: I0930 03:26:41.018299 4744 generic.go:334] "Generic (PLEG): container finished" podID="3e4f8446-ac54-4cff-b7f3-025ced28cc74" containerID="d6e1c4a50163a854101cccd02a4bee3537d688b081c764960f9d463c57b0382e" exitCode=0 Sep 30 03:26:41 crc kubenswrapper[4744]: I0930 03:26:41.018947 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" event={"ID":"3e4f8446-ac54-4cff-b7f3-025ced28cc74","Type":"ContainerDied","Data":"d6e1c4a50163a854101cccd02a4bee3537d688b081c764960f9d463c57b0382e"} Sep 30 03:26:41 crc kubenswrapper[4744]: I0930 03:26:41.109507 4744 scope.go:117] "RemoveContainer" containerID="c092655f713849f89592150967af91bb962be8d9d5379b07efc46fb997db13a3" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.516442 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.632793 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-inventory\") pod \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.633015 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr6km\" (UniqueName: \"kubernetes.io/projected/3e4f8446-ac54-4cff-b7f3-025ced28cc74-kube-api-access-dr6km\") pod \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.633065 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-ssh-key\") pod \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\" (UID: \"3e4f8446-ac54-4cff-b7f3-025ced28cc74\") " Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.640429 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4f8446-ac54-4cff-b7f3-025ced28cc74-kube-api-access-dr6km" (OuterVolumeSpecName: "kube-api-access-dr6km") pod "3e4f8446-ac54-4cff-b7f3-025ced28cc74" (UID: 
"3e4f8446-ac54-4cff-b7f3-025ced28cc74"). InnerVolumeSpecName "kube-api-access-dr6km". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.669048 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e4f8446-ac54-4cff-b7f3-025ced28cc74" (UID: "3e4f8446-ac54-4cff-b7f3-025ced28cc74"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.684275 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-inventory" (OuterVolumeSpecName: "inventory") pod "3e4f8446-ac54-4cff-b7f3-025ced28cc74" (UID: "3e4f8446-ac54-4cff-b7f3-025ced28cc74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.737298 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr6km\" (UniqueName: \"kubernetes.io/projected/3e4f8446-ac54-4cff-b7f3-025ced28cc74-kube-api-access-dr6km\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.737351 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:42 crc kubenswrapper[4744]: I0930 03:26:42.737399 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e4f8446-ac54-4cff-b7f3-025ced28cc74-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.046853 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" 
event={"ID":"3e4f8446-ac54-4cff-b7f3-025ced28cc74","Type":"ContainerDied","Data":"eea97d61549f946bb935b1a89dc75dd602e6b34f5fc4cd504167584f26346dd2"} Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.046910 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eea97d61549f946bb935b1a89dc75dd602e6b34f5fc4cd504167584f26346dd2" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.047777 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.156462 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz"] Sep 30 03:26:43 crc kubenswrapper[4744]: E0930 03:26:43.157778 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4f8446-ac54-4cff-b7f3-025ced28cc74" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.157813 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4f8446-ac54-4cff-b7f3-025ced28cc74" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.158515 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4f8446-ac54-4cff-b7f3-025ced28cc74" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.162270 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.167414 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.168138 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.168467 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.169183 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.169523 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.170183 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.172483 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.174240 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz"] Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.184107 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247307 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247361 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247413 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247434 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247597 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247646 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lct6z\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-kube-api-access-lct6z\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247756 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247790 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247862 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247900 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.247992 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.248029 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.248091 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.349786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.350035 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.350193 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.350314 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.350467 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.350649 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.350815 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.351042 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.351240 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.351447 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.351619 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: 
\"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.351797 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.351958 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.352120 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lct6z\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-kube-api-access-lct6z\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.354727 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.355017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.355639 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.355967 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.355988 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.358714 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.358882 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.359721 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.360272 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.360474 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.361110 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.361429 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.363830 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.377104 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lct6z\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-kube-api-access-lct6z\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-4thcz\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.516482 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:26:43 crc kubenswrapper[4744]: I0930 03:26:43.925234 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz"] Sep 30 03:26:44 crc kubenswrapper[4744]: I0930 03:26:44.056006 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" event={"ID":"de456177-d85a-41d5-aa9f-f7d7d6f68e21","Type":"ContainerStarted","Data":"858dc773ec0b57d799742f559536c32fd8e94c9d247bee34da33156a1ed1ffb1"} Sep 30 03:26:45 crc kubenswrapper[4744]: I0930 03:26:45.070835 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" event={"ID":"de456177-d85a-41d5-aa9f-f7d7d6f68e21","Type":"ContainerStarted","Data":"4248f4588d16d7b49378cd5e964968f827f6fffe1009d6025bf768d4de8cedc7"} Sep 30 03:26:45 crc kubenswrapper[4744]: I0930 03:26:45.127916 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" podStartSLOduration=1.7051597410000001 podStartE2EDuration="2.127883206s" podCreationTimestamp="2025-09-30 03:26:43 +0000 UTC" firstStartedPulling="2025-09-30 03:26:43.943762241 +0000 UTC m=+1931.116982215" lastFinishedPulling="2025-09-30 03:26:44.366485706 +0000 UTC m=+1931.539705680" observedRunningTime="2025-09-30 03:26:45.110052743 +0000 UTC m=+1932.283272747" watchObservedRunningTime="2025-09-30 03:26:45.127883206 +0000 UTC m=+1932.301103220" Sep 30 03:27:04 crc kubenswrapper[4744]: I0930 03:27:04.347643 4744 
patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:27:04 crc kubenswrapper[4744]: I0930 03:27:04.348496 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:27:26 crc kubenswrapper[4744]: I0930 03:27:26.547514 4744 generic.go:334] "Generic (PLEG): container finished" podID="de456177-d85a-41d5-aa9f-f7d7d6f68e21" containerID="4248f4588d16d7b49378cd5e964968f827f6fffe1009d6025bf768d4de8cedc7" exitCode=0 Sep 30 03:27:26 crc kubenswrapper[4744]: I0930 03:27:26.547592 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" event={"ID":"de456177-d85a-41d5-aa9f-f7d7d6f68e21","Type":"ContainerDied","Data":"4248f4588d16d7b49378cd5e964968f827f6fffe1009d6025bf768d4de8cedc7"} Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.034487 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126181 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126318 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ssh-key\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126367 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-bootstrap-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126440 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-repo-setup-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126479 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-ovn-default-certs-0\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: 
\"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-inventory\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126557 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-libvirt-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126632 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lct6z\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-kube-api-access-lct6z\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126684 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-nova-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126715 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126748 4744 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-neutron-metadata-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126784 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.126916 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-telemetry-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.127040 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ovn-combined-ca-bundle\") pod \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\" (UID: \"de456177-d85a-41d5-aa9f-f7d7d6f68e21\") " Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.133190 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.135124 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.136148 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.136326 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.137232 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.140229 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.141658 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.142218 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.142267 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.142285 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.146055 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-kube-api-access-lct6z" (OuterVolumeSpecName: "kube-api-access-lct6z") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "kube-api-access-lct6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.147114 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.163591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-inventory" (OuterVolumeSpecName: "inventory") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.171271 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "de456177-d85a-41d5-aa9f-f7d7d6f68e21" (UID: "de456177-d85a-41d5-aa9f-f7d7d6f68e21"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231305 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lct6z\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-kube-api-access-lct6z\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231346 4744 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231360 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231422 4744 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231440 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 
30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231454 4744 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231494 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231507 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231519 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231530 4744 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231543 4744 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231585 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/de456177-d85a-41d5-aa9f-f7d7d6f68e21-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231602 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.231618 4744 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de456177-d85a-41d5-aa9f-f7d7d6f68e21-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.575013 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" event={"ID":"de456177-d85a-41d5-aa9f-f7d7d6f68e21","Type":"ContainerDied","Data":"858dc773ec0b57d799742f559536c32fd8e94c9d247bee34da33156a1ed1ffb1"} Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.575063 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="858dc773ec0b57d799742f559536c32fd8e94c9d247bee34da33156a1ed1ffb1" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.575129 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4thcz" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.705537 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h"] Sep 30 03:27:28 crc kubenswrapper[4744]: E0930 03:27:28.706311 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de456177-d85a-41d5-aa9f-f7d7d6f68e21" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.706335 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="de456177-d85a-41d5-aa9f-f7d7d6f68e21" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.706642 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="de456177-d85a-41d5-aa9f-f7d7d6f68e21" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.707483 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.709540 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.709814 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.709843 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.709872 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.713019 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.727634 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h"] Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.846306 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmnk\" (UniqueName: \"kubernetes.io/projected/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-kube-api-access-7gmnk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.846397 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.846655 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.847014 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.847141 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.949271 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.949338 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.949423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmnk\" (UniqueName: \"kubernetes.io/projected/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-kube-api-access-7gmnk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.949477 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.949541 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.950595 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc 
kubenswrapper[4744]: I0930 03:27:28.954827 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.954853 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.958649 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:28 crc kubenswrapper[4744]: I0930 03:27:28.979886 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmnk\" (UniqueName: \"kubernetes.io/projected/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-kube-api-access-7gmnk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vtk6h\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:29 crc kubenswrapper[4744]: I0930 03:27:29.025625 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:27:29 crc kubenswrapper[4744]: I0930 03:27:29.367481 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h"] Sep 30 03:27:29 crc kubenswrapper[4744]: I0930 03:27:29.589180 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" event={"ID":"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225","Type":"ContainerStarted","Data":"1757cec6734c121c05ae677ab49aaeeae1889a29e8ecfd576dc2a53570fa506c"} Sep 30 03:27:30 crc kubenswrapper[4744]: I0930 03:27:30.606819 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" event={"ID":"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225","Type":"ContainerStarted","Data":"1e91362aac0fcead32925196a56b585a3cd82157ca9eecb0ff78719be233c657"} Sep 30 03:27:30 crc kubenswrapper[4744]: I0930 03:27:30.640744 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" podStartSLOduration=2.077544639 podStartE2EDuration="2.640717384s" podCreationTimestamp="2025-09-30 03:27:28 +0000 UTC" firstStartedPulling="2025-09-30 03:27:29.37160124 +0000 UTC m=+1976.544821214" lastFinishedPulling="2025-09-30 03:27:29.934773955 +0000 UTC m=+1977.107993959" observedRunningTime="2025-09-30 03:27:30.635422159 +0000 UTC m=+1977.808642173" watchObservedRunningTime="2025-09-30 03:27:30.640717384 +0000 UTC m=+1977.813937368" Sep 30 03:27:34 crc kubenswrapper[4744]: I0930 03:27:34.348018 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:27:34 crc kubenswrapper[4744]: I0930 03:27:34.348713 4744 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:28:04 crc kubenswrapper[4744]: I0930 03:28:04.348005 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:28:04 crc kubenswrapper[4744]: I0930 03:28:04.349721 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:28:04 crc kubenswrapper[4744]: I0930 03:28:04.349783 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:28:04 crc kubenswrapper[4744]: I0930 03:28:04.350658 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe2e0c31b2f11e084705476ec0ebe78e94be3c2f8bdcb24273e81b7e0e5969e9"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:28:04 crc kubenswrapper[4744]: I0930 03:28:04.350718 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" 
containerName="machine-config-daemon" containerID="cri-o://fe2e0c31b2f11e084705476ec0ebe78e94be3c2f8bdcb24273e81b7e0e5969e9" gracePeriod=600 Sep 30 03:28:05 crc kubenswrapper[4744]: I0930 03:28:05.001096 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="fe2e0c31b2f11e084705476ec0ebe78e94be3c2f8bdcb24273e81b7e0e5969e9" exitCode=0 Sep 30 03:28:05 crc kubenswrapper[4744]: I0930 03:28:05.001145 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"fe2e0c31b2f11e084705476ec0ebe78e94be3c2f8bdcb24273e81b7e0e5969e9"} Sep 30 03:28:05 crc kubenswrapper[4744]: I0930 03:28:05.001855 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"} Sep 30 03:28:05 crc kubenswrapper[4744]: I0930 03:28:05.001880 4744 scope.go:117] "RemoveContainer" containerID="5c6394e3e350b769839e4db83871371313066b963e215e67f2ee52e833adb31f" Sep 30 03:28:39 crc kubenswrapper[4744]: I0930 03:28:39.403633 4744 generic.go:334] "Generic (PLEG): container finished" podID="4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" containerID="1e91362aac0fcead32925196a56b585a3cd82157ca9eecb0ff78719be233c657" exitCode=0 Sep 30 03:28:39 crc kubenswrapper[4744]: I0930 03:28:39.403789 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" event={"ID":"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225","Type":"ContainerDied","Data":"1e91362aac0fcead32925196a56b585a3cd82157ca9eecb0ff78719be233c657"} Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.904583 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.986169 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmnk\" (UniqueName: \"kubernetes.io/projected/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-kube-api-access-7gmnk\") pod \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.986253 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovncontroller-config-0\") pod \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.986329 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ssh-key\") pod \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.986428 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-inventory\") pod \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.986507 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovn-combined-ca-bundle\") pod \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\" (UID: \"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225\") " Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.994860 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-kube-api-access-7gmnk" (OuterVolumeSpecName: "kube-api-access-7gmnk") pod "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" (UID: "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225"). InnerVolumeSpecName "kube-api-access-7gmnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:28:40 crc kubenswrapper[4744]: I0930 03:28:40.995645 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" (UID: "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.015918 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" (UID: "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.036299 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" (UID: "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.036468 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-inventory" (OuterVolumeSpecName: "inventory") pod "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" (UID: "4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.090452 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmnk\" (UniqueName: \"kubernetes.io/projected/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-kube-api-access-7gmnk\") on node \"crc\" DevicePath \"\"" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.090508 4744 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.090527 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.090548 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.090567 4744 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.428521 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" event={"ID":"4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225","Type":"ContainerDied","Data":"1757cec6734c121c05ae677ab49aaeeae1889a29e8ecfd576dc2a53570fa506c"} Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.428575 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1757cec6734c121c05ae677ab49aaeeae1889a29e8ecfd576dc2a53570fa506c" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.428617 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vtk6h" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.625030 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb"] Sep 30 03:28:41 crc kubenswrapper[4744]: E0930 03:28:41.625855 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.625878 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.626197 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.626978 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.629618 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.630023 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.630159 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.630296 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.630645 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.631448 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.649548 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb"] Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.706536 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.706658 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.706832 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgjzc\" (UniqueName: \"kubernetes.io/projected/ced06625-11b0-4e49-9874-9f627107037c-kube-api-access-cgjzc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.706978 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.707029 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.707108 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.809270 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.809327 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.809423 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgjzc\" (UniqueName: \"kubernetes.io/projected/ced06625-11b0-4e49-9874-9f627107037c-kube-api-access-cgjzc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.809509 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.809539 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.809560 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.814337 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.815443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.815662 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.817670 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.826269 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.829011 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgjzc\" (UniqueName: \"kubernetes.io/projected/ced06625-11b0-4e49-9874-9f627107037c-kube-api-access-cgjzc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:41 crc kubenswrapper[4744]: I0930 03:28:41.963801 
4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:28:42 crc kubenswrapper[4744]: I0930 03:28:42.364883 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb"] Sep 30 03:28:42 crc kubenswrapper[4744]: W0930 03:28:42.371386 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podced06625_11b0_4e49_9874_9f627107037c.slice/crio-077bbffb505e4dad4d41596e2e751a0bef282f6e1c915db2cccaee65cc83b844 WatchSource:0}: Error finding container 077bbffb505e4dad4d41596e2e751a0bef282f6e1c915db2cccaee65cc83b844: Status 404 returned error can't find the container with id 077bbffb505e4dad4d41596e2e751a0bef282f6e1c915db2cccaee65cc83b844 Sep 30 03:28:42 crc kubenswrapper[4744]: I0930 03:28:42.437355 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" event={"ID":"ced06625-11b0-4e49-9874-9f627107037c","Type":"ContainerStarted","Data":"077bbffb505e4dad4d41596e2e751a0bef282f6e1c915db2cccaee65cc83b844"} Sep 30 03:28:43 crc kubenswrapper[4744]: I0930 03:28:43.453976 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" event={"ID":"ced06625-11b0-4e49-9874-9f627107037c","Type":"ContainerStarted","Data":"2016ae0d8d448d0af424c3da857051928dd6cadebb340228b05ce8dcb95fb47b"} Sep 30 03:28:43 crc kubenswrapper[4744]: I0930 03:28:43.493510 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" podStartSLOduration=1.999327773 podStartE2EDuration="2.493482705s" podCreationTimestamp="2025-09-30 03:28:41 +0000 UTC" firstStartedPulling="2025-09-30 03:28:42.374344598 +0000 UTC m=+2049.547564602" 
lastFinishedPulling="2025-09-30 03:28:42.86849953 +0000 UTC m=+2050.041719534" observedRunningTime="2025-09-30 03:28:43.475516677 +0000 UTC m=+2050.648736681" watchObservedRunningTime="2025-09-30 03:28:43.493482705 +0000 UTC m=+2050.666702709" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.264888 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dhg8d"] Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.268844 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.288976 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dhg8d"] Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.379293 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-utilities\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.379585 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-catalog-content\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.379907 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsr8v\" (UniqueName: \"kubernetes.io/projected/dad12778-5d1f-4d58-967d-9b5af5e77ff4-kube-api-access-bsr8v\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " 
pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.482122 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsr8v\" (UniqueName: \"kubernetes.io/projected/dad12778-5d1f-4d58-967d-9b5af5e77ff4-kube-api-access-bsr8v\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.482308 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-utilities\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.482484 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-catalog-content\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.482855 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-utilities\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.483082 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-catalog-content\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc 
kubenswrapper[4744]: I0930 03:28:48.515704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsr8v\" (UniqueName: \"kubernetes.io/projected/dad12778-5d1f-4d58-967d-9b5af5e77ff4-kube-api-access-bsr8v\") pod \"redhat-operators-dhg8d\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:48 crc kubenswrapper[4744]: I0930 03:28:48.609484 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:49 crc kubenswrapper[4744]: I0930 03:28:49.117949 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dhg8d"] Sep 30 03:28:49 crc kubenswrapper[4744]: I0930 03:28:49.519045 4744 generic.go:334] "Generic (PLEG): container finished" podID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerID="09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec" exitCode=0 Sep 30 03:28:49 crc kubenswrapper[4744]: I0930 03:28:49.519103 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerDied","Data":"09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec"} Sep 30 03:28:49 crc kubenswrapper[4744]: I0930 03:28:49.519308 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerStarted","Data":"239786fda09071cd46f067aa6d6bd8ee8da088fabde155bb677a9f66f6fab564"} Sep 30 03:28:50 crc kubenswrapper[4744]: I0930 03:28:50.535392 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerStarted","Data":"19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f"} Sep 30 03:28:51 crc kubenswrapper[4744]: I0930 
03:28:51.550754 4744 generic.go:334] "Generic (PLEG): container finished" podID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerID="19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f" exitCode=0 Sep 30 03:28:51 crc kubenswrapper[4744]: I0930 03:28:51.550814 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerDied","Data":"19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f"} Sep 30 03:28:52 crc kubenswrapper[4744]: I0930 03:28:52.566316 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerStarted","Data":"b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38"} Sep 30 03:28:52 crc kubenswrapper[4744]: I0930 03:28:52.591202 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dhg8d" podStartSLOduration=2.080263104 podStartE2EDuration="4.591177395s" podCreationTimestamp="2025-09-30 03:28:48 +0000 UTC" firstStartedPulling="2025-09-30 03:28:49.52098086 +0000 UTC m=+2056.694200834" lastFinishedPulling="2025-09-30 03:28:52.031895121 +0000 UTC m=+2059.205115125" observedRunningTime="2025-09-30 03:28:52.589954507 +0000 UTC m=+2059.763174491" watchObservedRunningTime="2025-09-30 03:28:52.591177395 +0000 UTC m=+2059.764397399" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.609811 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4x8m4"] Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.612325 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.657469 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4x8m4"] Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.755961 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-catalog-content\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.756048 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-utilities\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.756285 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhtld\" (UniqueName: \"kubernetes.io/projected/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-kube-api-access-jhtld\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.858095 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhtld\" (UniqueName: \"kubernetes.io/projected/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-kube-api-access-jhtld\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.858202 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-catalog-content\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.858225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-utilities\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.858657 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-utilities\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.858825 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-catalog-content\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.890902 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhtld\" (UniqueName: \"kubernetes.io/projected/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-kube-api-access-jhtld\") pod \"certified-operators-4x8m4\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:55 crc kubenswrapper[4744]: I0930 03:28:55.933872 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:28:56 crc kubenswrapper[4744]: I0930 03:28:56.421258 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4x8m4"] Sep 30 03:28:56 crc kubenswrapper[4744]: I0930 03:28:56.604299 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerStarted","Data":"7312be7cdd605e3db1034c54d5a2d51cd6714bbf6c0629bc53a30795e3b8a8bd"} Sep 30 03:28:57 crc kubenswrapper[4744]: I0930 03:28:57.617741 4744 generic.go:334] "Generic (PLEG): container finished" podID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerID="1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7" exitCode=0 Sep 30 03:28:57 crc kubenswrapper[4744]: I0930 03:28:57.617820 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerDied","Data":"1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7"} Sep 30 03:28:58 crc kubenswrapper[4744]: I0930 03:28:58.609653 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:58 crc kubenswrapper[4744]: I0930 03:28:58.610049 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:58 crc kubenswrapper[4744]: I0930 03:28:58.634187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerStarted","Data":"61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a"} Sep 30 03:28:58 crc kubenswrapper[4744]: I0930 03:28:58.688015 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:58 crc kubenswrapper[4744]: I0930 03:28:58.749951 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:28:59 crc kubenswrapper[4744]: I0930 03:28:59.648893 4744 generic.go:334] "Generic (PLEG): container finished" podID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerID="61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a" exitCode=0 Sep 30 03:28:59 crc kubenswrapper[4744]: I0930 03:28:59.649075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerDied","Data":"61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a"} Sep 30 03:29:00 crc kubenswrapper[4744]: I0930 03:29:00.666508 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerStarted","Data":"e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f"} Sep 30 03:29:00 crc kubenswrapper[4744]: I0930 03:29:00.702599 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4x8m4" podStartSLOduration=3.307180029 podStartE2EDuration="5.702572482s" podCreationTimestamp="2025-09-30 03:28:55 +0000 UTC" firstStartedPulling="2025-09-30 03:28:57.620357314 +0000 UTC m=+2064.793577288" lastFinishedPulling="2025-09-30 03:29:00.015749727 +0000 UTC m=+2067.188969741" observedRunningTime="2025-09-30 03:29:00.691991103 +0000 UTC m=+2067.865211117" watchObservedRunningTime="2025-09-30 03:29:00.702572482 +0000 UTC m=+2067.875792496" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.012413 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dhg8d"] Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 
03:29:01.013059 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dhg8d" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="registry-server" containerID="cri-o://b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38" gracePeriod=2 Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.467504 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.594852 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsr8v\" (UniqueName: \"kubernetes.io/projected/dad12778-5d1f-4d58-967d-9b5af5e77ff4-kube-api-access-bsr8v\") pod \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.596873 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-catalog-content\") pod \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.597634 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-utilities\") pod \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\" (UID: \"dad12778-5d1f-4d58-967d-9b5af5e77ff4\") " Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.602284 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad12778-5d1f-4d58-967d-9b5af5e77ff4-kube-api-access-bsr8v" (OuterVolumeSpecName: "kube-api-access-bsr8v") pod "dad12778-5d1f-4d58-967d-9b5af5e77ff4" (UID: "dad12778-5d1f-4d58-967d-9b5af5e77ff4"). InnerVolumeSpecName "kube-api-access-bsr8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.603143 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-utilities" (OuterVolumeSpecName: "utilities") pod "dad12778-5d1f-4d58-967d-9b5af5e77ff4" (UID: "dad12778-5d1f-4d58-967d-9b5af5e77ff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.679153 4744 generic.go:334] "Generic (PLEG): container finished" podID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerID="b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38" exitCode=0 Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.679227 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerDied","Data":"b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38"} Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.679289 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dhg8d" event={"ID":"dad12778-5d1f-4d58-967d-9b5af5e77ff4","Type":"ContainerDied","Data":"239786fda09071cd46f067aa6d6bd8ee8da088fabde155bb677a9f66f6fab564"} Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.679312 4744 scope.go:117] "RemoveContainer" containerID="b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.681672 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dhg8d" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.692578 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dad12778-5d1f-4d58-967d-9b5af5e77ff4" (UID: "dad12778-5d1f-4d58-967d-9b5af5e77ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.696395 4744 scope.go:117] "RemoveContainer" containerID="19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.699830 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsr8v\" (UniqueName: \"kubernetes.io/projected/dad12778-5d1f-4d58-967d-9b5af5e77ff4-kube-api-access-bsr8v\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.699855 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.699868 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad12778-5d1f-4d58-967d-9b5af5e77ff4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.718655 4744 scope.go:117] "RemoveContainer" containerID="09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.779464 4744 scope.go:117] "RemoveContainer" containerID="b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38" Sep 30 03:29:01 crc kubenswrapper[4744]: E0930 03:29:01.779910 4744 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38\": container with ID starting with b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38 not found: ID does not exist" containerID="b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.779937 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38"} err="failed to get container status \"b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38\": rpc error: code = NotFound desc = could not find container \"b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38\": container with ID starting with b077484bf9caa24e26ab9326b9627b574c05d5b178b95788005b875ada105b38 not found: ID does not exist" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.779959 4744 scope.go:117] "RemoveContainer" containerID="19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f" Sep 30 03:29:01 crc kubenswrapper[4744]: E0930 03:29:01.780267 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f\": container with ID starting with 19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f not found: ID does not exist" containerID="19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.780321 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f"} err="failed to get container status \"19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f\": rpc error: code = NotFound desc = could not find container 
\"19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f\": container with ID starting with 19e52f39bcab71b83daf64d593d5b44c98a5996898e40f92acd259ed030e022f not found: ID does not exist" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.780356 4744 scope.go:117] "RemoveContainer" containerID="09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec" Sep 30 03:29:01 crc kubenswrapper[4744]: E0930 03:29:01.780788 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec\": container with ID starting with 09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec not found: ID does not exist" containerID="09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec" Sep 30 03:29:01 crc kubenswrapper[4744]: I0930 03:29:01.780811 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec"} err="failed to get container status \"09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec\": rpc error: code = NotFound desc = could not find container \"09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec\": container with ID starting with 09651cc795cbcc5447db53491912e2b2b8032c396552af8638a174e5d8c865ec not found: ID does not exist" Sep 30 03:29:02 crc kubenswrapper[4744]: I0930 03:29:02.031997 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dhg8d"] Sep 30 03:29:02 crc kubenswrapper[4744]: I0930 03:29:02.041922 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dhg8d"] Sep 30 03:29:03 crc kubenswrapper[4744]: I0930 03:29:03.539735 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" 
path="/var/lib/kubelet/pods/dad12778-5d1f-4d58-967d-9b5af5e77ff4/volumes" Sep 30 03:29:05 crc kubenswrapper[4744]: I0930 03:29:05.934820 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:29:05 crc kubenswrapper[4744]: I0930 03:29:05.935250 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:29:06 crc kubenswrapper[4744]: I0930 03:29:06.000178 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:29:06 crc kubenswrapper[4744]: I0930 03:29:06.794127 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:29:07 crc kubenswrapper[4744]: I0930 03:29:07.006741 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4x8m4"] Sep 30 03:29:08 crc kubenswrapper[4744]: I0930 03:29:08.752912 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4x8m4" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="registry-server" containerID="cri-o://e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f" gracePeriod=2 Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.271110 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.391164 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhtld\" (UniqueName: \"kubernetes.io/projected/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-kube-api-access-jhtld\") pod \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.391741 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-utilities\") pod \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.391992 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-catalog-content\") pod \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\" (UID: \"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49\") " Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.392591 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-utilities" (OuterVolumeSpecName: "utilities") pod "6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" (UID: "6dcb0b69-8fc1-4c88-be58-59c1c17a4e49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.402576 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-kube-api-access-jhtld" (OuterVolumeSpecName: "kube-api-access-jhtld") pod "6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" (UID: "6dcb0b69-8fc1-4c88-be58-59c1c17a4e49"). InnerVolumeSpecName "kube-api-access-jhtld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.461205 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" (UID: "6dcb0b69-8fc1-4c88-be58-59c1c17a4e49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.494759 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.494802 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhtld\" (UniqueName: \"kubernetes.io/projected/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-kube-api-access-jhtld\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.494818 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.768643 4744 generic.go:334] "Generic (PLEG): container finished" podID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerID="e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f" exitCode=0 Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.768690 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerDied","Data":"e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f"} Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.768722 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-4x8m4" event={"ID":"6dcb0b69-8fc1-4c88-be58-59c1c17a4e49","Type":"ContainerDied","Data":"7312be7cdd605e3db1034c54d5a2d51cd6714bbf6c0629bc53a30795e3b8a8bd"} Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.768743 4744 scope.go:117] "RemoveContainer" containerID="e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.768795 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4x8m4" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.812109 4744 scope.go:117] "RemoveContainer" containerID="61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.816208 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4x8m4"] Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.826339 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4x8m4"] Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.859908 4744 scope.go:117] "RemoveContainer" containerID="1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.917732 4744 scope.go:117] "RemoveContainer" containerID="e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f" Sep 30 03:29:09 crc kubenswrapper[4744]: E0930 03:29:09.918328 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f\": container with ID starting with e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f not found: ID does not exist" containerID="e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 
03:29:09.918405 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f"} err="failed to get container status \"e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f\": rpc error: code = NotFound desc = could not find container \"e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f\": container with ID starting with e9736530ced3a1c5e94548843e46116c81683396a3fa537ab14da4e0f7d4473f not found: ID does not exist" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.918442 4744 scope.go:117] "RemoveContainer" containerID="61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a" Sep 30 03:29:09 crc kubenswrapper[4744]: E0930 03:29:09.918883 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a\": container with ID starting with 61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a not found: ID does not exist" containerID="61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.918947 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a"} err="failed to get container status \"61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a\": rpc error: code = NotFound desc = could not find container \"61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a\": container with ID starting with 61b198c0e5f22f9291f017d2936e89fba4f31103bbb1df1b1d2ea12431a8b64a not found: ID does not exist" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.918980 4744 scope.go:117] "RemoveContainer" containerID="1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7" Sep 30 03:29:09 crc 
kubenswrapper[4744]: E0930 03:29:09.919359 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7\": container with ID starting with 1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7 not found: ID does not exist" containerID="1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7" Sep 30 03:29:09 crc kubenswrapper[4744]: I0930 03:29:09.919434 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7"} err="failed to get container status \"1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7\": rpc error: code = NotFound desc = could not find container \"1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7\": container with ID starting with 1f7f961cbbdaefa61d918c4c19e0714c355aab5143508917061b05434add38b7 not found: ID does not exist" Sep 30 03:29:11 crc kubenswrapper[4744]: I0930 03:29:11.521982 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" path="/var/lib/kubelet/pods/6dcb0b69-8fc1-4c88-be58-59c1c17a4e49/volumes" Sep 30 03:29:37 crc kubenswrapper[4744]: I0930 03:29:37.174757 4744 generic.go:334] "Generic (PLEG): container finished" podID="ced06625-11b0-4e49-9874-9f627107037c" containerID="2016ae0d8d448d0af424c3da857051928dd6cadebb340228b05ce8dcb95fb47b" exitCode=0 Sep 30 03:29:37 crc kubenswrapper[4744]: I0930 03:29:37.175059 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" event={"ID":"ced06625-11b0-4e49-9874-9f627107037c","Type":"ContainerDied","Data":"2016ae0d8d448d0af424c3da857051928dd6cadebb340228b05ce8dcb95fb47b"} Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.714016 4744 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.805693 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-inventory\") pod \"ced06625-11b0-4e49-9874-9f627107037c\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.805741 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-metadata-combined-ca-bundle\") pod \"ced06625-11b0-4e49-9874-9f627107037c\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.805956 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ced06625-11b0-4e49-9874-9f627107037c\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.806028 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgjzc\" (UniqueName: \"kubernetes.io/projected/ced06625-11b0-4e49-9874-9f627107037c-kube-api-access-cgjzc\") pod \"ced06625-11b0-4e49-9874-9f627107037c\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.806116 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-nova-metadata-neutron-config-0\") pod \"ced06625-11b0-4e49-9874-9f627107037c\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") 
" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.806137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-ssh-key\") pod \"ced06625-11b0-4e49-9874-9f627107037c\" (UID: \"ced06625-11b0-4e49-9874-9f627107037c\") " Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.811483 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced06625-11b0-4e49-9874-9f627107037c-kube-api-access-cgjzc" (OuterVolumeSpecName: "kube-api-access-cgjzc") pod "ced06625-11b0-4e49-9874-9f627107037c" (UID: "ced06625-11b0-4e49-9874-9f627107037c"). InnerVolumeSpecName "kube-api-access-cgjzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.812765 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ced06625-11b0-4e49-9874-9f627107037c" (UID: "ced06625-11b0-4e49-9874-9f627107037c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.843681 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ced06625-11b0-4e49-9874-9f627107037c" (UID: "ced06625-11b0-4e49-9874-9f627107037c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.846825 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-inventory" (OuterVolumeSpecName: "inventory") pod "ced06625-11b0-4e49-9874-9f627107037c" (UID: "ced06625-11b0-4e49-9874-9f627107037c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.847226 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ced06625-11b0-4e49-9874-9f627107037c" (UID: "ced06625-11b0-4e49-9874-9f627107037c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.852234 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ced06625-11b0-4e49-9874-9f627107037c" (UID: "ced06625-11b0-4e49-9874-9f627107037c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.909288 4744 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.909343 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgjzc\" (UniqueName: \"kubernetes.io/projected/ced06625-11b0-4e49-9874-9f627107037c-kube-api-access-cgjzc\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.909365 4744 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.909408 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.909430 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:38 crc kubenswrapper[4744]: I0930 03:29:38.909448 4744 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced06625-11b0-4e49-9874-9f627107037c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.207897 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" 
event={"ID":"ced06625-11b0-4e49-9874-9f627107037c","Type":"ContainerDied","Data":"077bbffb505e4dad4d41596e2e751a0bef282f6e1c915db2cccaee65cc83b844"} Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.208265 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="077bbffb505e4dad4d41596e2e751a0bef282f6e1c915db2cccaee65cc83b844" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.207975 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325458 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh"] Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325837 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced06625-11b0-4e49-9874-9f627107037c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325859 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced06625-11b0-4e49-9874-9f627107037c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325874 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="extract-utilities" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325880 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="extract-utilities" Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325893 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="extract-content" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325899 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="extract-content" Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325923 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="extract-utilities" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325929 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="extract-utilities" Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325939 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="registry-server" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325946 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="registry-server" Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325963 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="registry-server" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325970 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="registry-server" Sep 30 03:29:39 crc kubenswrapper[4744]: E0930 03:29:39.325983 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="extract-content" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.325991 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="extract-content" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.326215 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad12778-5d1f-4d58-967d-9b5af5e77ff4" containerName="registry-server" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.326233 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6dcb0b69-8fc1-4c88-be58-59c1c17a4e49" containerName="registry-server" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.326248 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced06625-11b0-4e49-9874-9f627107037c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.326862 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.329147 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.329560 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.329975 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.330237 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.330591 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.348183 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh"] Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.422804 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.422847 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.422953 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.422979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz7sk\" (UniqueName: \"kubernetes.io/projected/fc1867d3-bb6f-4fca-876d-b868bcd284bb-kube-api-access-gz7sk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.423019 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.524756 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.524887 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz7sk\" (UniqueName: \"kubernetes.io/projected/fc1867d3-bb6f-4fca-876d-b868bcd284bb-kube-api-access-gz7sk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.524985 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.525180 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.525279 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") 
" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.532161 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.532688 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.533128 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.533521 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.547287 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz7sk\" (UniqueName: 
\"kubernetes.io/projected/fc1867d3-bb6f-4fca-876d-b868bcd284bb-kube-api-access-gz7sk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:39 crc kubenswrapper[4744]: I0930 03:29:39.671124 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:29:40 crc kubenswrapper[4744]: I0930 03:29:40.089743 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh"] Sep 30 03:29:40 crc kubenswrapper[4744]: I0930 03:29:40.217942 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" event={"ID":"fc1867d3-bb6f-4fca-876d-b868bcd284bb","Type":"ContainerStarted","Data":"2068a3c5550af8b923461be774f2f9ebad5f2d03c941bd6343cff84407f7e92d"} Sep 30 03:29:41 crc kubenswrapper[4744]: I0930 03:29:41.232590 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" event={"ID":"fc1867d3-bb6f-4fca-876d-b868bcd284bb","Type":"ContainerStarted","Data":"dd8f67efeaa3771e0b53cbdf32ae97dfa2d0e20f6763bf70fe67b79979edf6f1"} Sep 30 03:29:41 crc kubenswrapper[4744]: I0930 03:29:41.266240 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" podStartSLOduration=1.861363654 podStartE2EDuration="2.266211874s" podCreationTimestamp="2025-09-30 03:29:39 +0000 UTC" firstStartedPulling="2025-09-30 03:29:40.097583099 +0000 UTC m=+2107.270803073" lastFinishedPulling="2025-09-30 03:29:40.502431299 +0000 UTC m=+2107.675651293" observedRunningTime="2025-09-30 03:29:41.256681177 +0000 UTC m=+2108.429901191" watchObservedRunningTime="2025-09-30 03:29:41.266211874 +0000 UTC m=+2108.439431878" Sep 30 03:30:00 crc 
kubenswrapper[4744]: I0930 03:30:00.179424 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"]
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.181950 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.186078 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"]
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.186164 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.190517 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.216053 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30471e6c-5068-46d1-a5b7-48d29949d489-config-volume\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.216218 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhtpk\" (UniqueName: \"kubernetes.io/projected/30471e6c-5068-46d1-a5b7-48d29949d489-kube-api-access-xhtpk\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.216532 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30471e6c-5068-46d1-a5b7-48d29949d489-secret-volume\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.319826 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30471e6c-5068-46d1-a5b7-48d29949d489-secret-volume\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.320221 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30471e6c-5068-46d1-a5b7-48d29949d489-config-volume\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.320529 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhtpk\" (UniqueName: \"kubernetes.io/projected/30471e6c-5068-46d1-a5b7-48d29949d489-kube-api-access-xhtpk\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.322268 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30471e6c-5068-46d1-a5b7-48d29949d489-config-volume\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.331111 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30471e6c-5068-46d1-a5b7-48d29949d489-secret-volume\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.352041 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhtpk\" (UniqueName: \"kubernetes.io/projected/30471e6c-5068-46d1-a5b7-48d29949d489-kube-api-access-xhtpk\") pod \"collect-profiles-29320050-4d6nl\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:00 crc kubenswrapper[4744]: I0930 03:30:00.506626 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:01 crc kubenswrapper[4744]: I0930 03:30:01.021999 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"]
Sep 30 03:30:01 crc kubenswrapper[4744]: W0930 03:30:01.032774 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30471e6c_5068_46d1_a5b7_48d29949d489.slice/crio-62d3f8dc23b18fb32c64b7e07752181c92c603a92b476fc79f445f52fe7fcf5f WatchSource:0}: Error finding container 62d3f8dc23b18fb32c64b7e07752181c92c603a92b476fc79f445f52fe7fcf5f: Status 404 returned error can't find the container with id 62d3f8dc23b18fb32c64b7e07752181c92c603a92b476fc79f445f52fe7fcf5f
Sep 30 03:30:01 crc kubenswrapper[4744]: I0930 03:30:01.492193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl" event={"ID":"30471e6c-5068-46d1-a5b7-48d29949d489","Type":"ContainerStarted","Data":"690c653dad5d3629c41f033b96a26b8e3bb1386d9c9b300428f4069f0b102a5a"}
Sep 30 03:30:01 crc kubenswrapper[4744]: I0930 03:30:01.492577 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl" event={"ID":"30471e6c-5068-46d1-a5b7-48d29949d489","Type":"ContainerStarted","Data":"62d3f8dc23b18fb32c64b7e07752181c92c603a92b476fc79f445f52fe7fcf5f"}
Sep 30 03:30:01 crc kubenswrapper[4744]: I0930 03:30:01.521239 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl" podStartSLOduration=1.5212143089999999 podStartE2EDuration="1.521214309s" podCreationTimestamp="2025-09-30 03:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 03:30:01.508179025 +0000 UTC m=+2128.681399039" watchObservedRunningTime="2025-09-30 03:30:01.521214309 +0000 UTC m=+2128.694434293"
Sep 30 03:30:02 crc kubenswrapper[4744]: I0930 03:30:02.507915 4744 generic.go:334] "Generic (PLEG): container finished" podID="30471e6c-5068-46d1-a5b7-48d29949d489" containerID="690c653dad5d3629c41f033b96a26b8e3bb1386d9c9b300428f4069f0b102a5a" exitCode=0
Sep 30 03:30:02 crc kubenswrapper[4744]: I0930 03:30:02.507980 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl" event={"ID":"30471e6c-5068-46d1-a5b7-48d29949d489","Type":"ContainerDied","Data":"690c653dad5d3629c41f033b96a26b8e3bb1386d9c9b300428f4069f0b102a5a"}
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.895645 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.900617 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhtpk\" (UniqueName: \"kubernetes.io/projected/30471e6c-5068-46d1-a5b7-48d29949d489-kube-api-access-xhtpk\") pod \"30471e6c-5068-46d1-a5b7-48d29949d489\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") "
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.900712 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30471e6c-5068-46d1-a5b7-48d29949d489-secret-volume\") pod \"30471e6c-5068-46d1-a5b7-48d29949d489\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") "
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.900761 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30471e6c-5068-46d1-a5b7-48d29949d489-config-volume\") pod \"30471e6c-5068-46d1-a5b7-48d29949d489\" (UID: \"30471e6c-5068-46d1-a5b7-48d29949d489\") "
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.901690 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30471e6c-5068-46d1-a5b7-48d29949d489-config-volume" (OuterVolumeSpecName: "config-volume") pod "30471e6c-5068-46d1-a5b7-48d29949d489" (UID: "30471e6c-5068-46d1-a5b7-48d29949d489"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.909237 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30471e6c-5068-46d1-a5b7-48d29949d489-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "30471e6c-5068-46d1-a5b7-48d29949d489" (UID: "30471e6c-5068-46d1-a5b7-48d29949d489"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 03:30:03 crc kubenswrapper[4744]: I0930 03:30:03.912348 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30471e6c-5068-46d1-a5b7-48d29949d489-kube-api-access-xhtpk" (OuterVolumeSpecName: "kube-api-access-xhtpk") pod "30471e6c-5068-46d1-a5b7-48d29949d489" (UID: "30471e6c-5068-46d1-a5b7-48d29949d489"). InnerVolumeSpecName "kube-api-access-xhtpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.002651 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30471e6c-5068-46d1-a5b7-48d29949d489-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.002683 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhtpk\" (UniqueName: \"kubernetes.io/projected/30471e6c-5068-46d1-a5b7-48d29949d489-kube-api-access-xhtpk\") on node \"crc\" DevicePath \"\""
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.002695 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30471e6c-5068-46d1-a5b7-48d29949d489-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.348008 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.348082 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.540766 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl" event={"ID":"30471e6c-5068-46d1-a5b7-48d29949d489","Type":"ContainerDied","Data":"62d3f8dc23b18fb32c64b7e07752181c92c603a92b476fc79f445f52fe7fcf5f"}
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.540833 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62d3f8dc23b18fb32c64b7e07752181c92c603a92b476fc79f445f52fe7fcf5f"
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.540782 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.608272 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx"]
Sep 30 03:30:04 crc kubenswrapper[4744]: I0930 03:30:04.618150 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320005-gxcgx"]
Sep 30 03:30:05 crc kubenswrapper[4744]: I0930 03:30:05.527658 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2676764-efb6-4e02-9012-74b8675e7bff" path="/var/lib/kubelet/pods/b2676764-efb6-4e02-9012-74b8675e7bff/volumes"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.276483 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25dtx"]
Sep 30 03:30:09 crc kubenswrapper[4744]: E0930 03:30:09.277748 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30471e6c-5068-46d1-a5b7-48d29949d489" containerName="collect-profiles"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.277769 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="30471e6c-5068-46d1-a5b7-48d29949d489" containerName="collect-profiles"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.278138 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="30471e6c-5068-46d1-a5b7-48d29949d489" containerName="collect-profiles"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.280614 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.292510 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25dtx"]
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.321561 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds5qn\" (UniqueName: \"kubernetes.io/projected/627efd38-311b-4fc4-a852-82e86a1be04a-kube-api-access-ds5qn\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.321746 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-utilities\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.321769 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-catalog-content\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.424675 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-utilities\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.424761 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-catalog-content\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.424879 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds5qn\" (UniqueName: \"kubernetes.io/projected/627efd38-311b-4fc4-a852-82e86a1be04a-kube-api-access-ds5qn\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.425208 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-utilities\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.425335 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-catalog-content\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.447387 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds5qn\" (UniqueName: \"kubernetes.io/projected/627efd38-311b-4fc4-a852-82e86a1be04a-kube-api-access-ds5qn\") pod \"community-operators-25dtx\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") " pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:09 crc kubenswrapper[4744]: I0930 03:30:09.623120 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:10 crc kubenswrapper[4744]: I0930 03:30:10.163982 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25dtx"]
Sep 30 03:30:10 crc kubenswrapper[4744]: W0930 03:30:10.168858 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod627efd38_311b_4fc4_a852_82e86a1be04a.slice/crio-6bc678ef1dd15e95fb7bb02233b382e8b82986b0857b9cbc98a2e8919d2958da WatchSource:0}: Error finding container 6bc678ef1dd15e95fb7bb02233b382e8b82986b0857b9cbc98a2e8919d2958da: Status 404 returned error can't find the container with id 6bc678ef1dd15e95fb7bb02233b382e8b82986b0857b9cbc98a2e8919d2958da
Sep 30 03:30:10 crc kubenswrapper[4744]: I0930 03:30:10.615010 4744 generic.go:334] "Generic (PLEG): container finished" podID="627efd38-311b-4fc4-a852-82e86a1be04a" containerID="2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8" exitCode=0
Sep 30 03:30:10 crc kubenswrapper[4744]: I0930 03:30:10.615077 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25dtx" event={"ID":"627efd38-311b-4fc4-a852-82e86a1be04a","Type":"ContainerDied","Data":"2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8"}
Sep 30 03:30:10 crc kubenswrapper[4744]: I0930 03:30:10.615118 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25dtx" event={"ID":"627efd38-311b-4fc4-a852-82e86a1be04a","Type":"ContainerStarted","Data":"6bc678ef1dd15e95fb7bb02233b382e8b82986b0857b9cbc98a2e8919d2958da"}
Sep 30 03:30:12 crc kubenswrapper[4744]: I0930 03:30:12.663870 4744 generic.go:334] "Generic (PLEG): container finished" podID="627efd38-311b-4fc4-a852-82e86a1be04a" containerID="a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84" exitCode=0
Sep 30 03:30:12 crc kubenswrapper[4744]: I0930 03:30:12.663946 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25dtx" event={"ID":"627efd38-311b-4fc4-a852-82e86a1be04a","Type":"ContainerDied","Data":"a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84"}
Sep 30 03:30:13 crc kubenswrapper[4744]: I0930 03:30:13.687000 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25dtx" event={"ID":"627efd38-311b-4fc4-a852-82e86a1be04a","Type":"ContainerStarted","Data":"ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098"}
Sep 30 03:30:13 crc kubenswrapper[4744]: I0930 03:30:13.722887 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25dtx" podStartSLOduration=2.123933839 podStartE2EDuration="4.722858533s" podCreationTimestamp="2025-09-30 03:30:09 +0000 UTC" firstStartedPulling="2025-09-30 03:30:10.619042864 +0000 UTC m=+2137.792262848" lastFinishedPulling="2025-09-30 03:30:13.217967538 +0000 UTC m=+2140.391187542" observedRunningTime="2025-09-30 03:30:13.711940245 +0000 UTC m=+2140.885160239" watchObservedRunningTime="2025-09-30 03:30:13.722858533 +0000 UTC m=+2140.896078547"
Sep 30 03:30:19 crc kubenswrapper[4744]: I0930 03:30:19.624003 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:19 crc kubenswrapper[4744]: I0930 03:30:19.624621 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:19 crc kubenswrapper[4744]: I0930 03:30:19.691080 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:19 crc kubenswrapper[4744]: I0930 03:30:19.832088 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:19 crc kubenswrapper[4744]: I0930 03:30:19.937090 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25dtx"]
Sep 30 03:30:21 crc kubenswrapper[4744]: I0930 03:30:21.782421 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25dtx" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="registry-server" containerID="cri-o://ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098" gracePeriod=2
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.313059 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.429337 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds5qn\" (UniqueName: \"kubernetes.io/projected/627efd38-311b-4fc4-a852-82e86a1be04a-kube-api-access-ds5qn\") pod \"627efd38-311b-4fc4-a852-82e86a1be04a\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") "
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.429545 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-catalog-content\") pod \"627efd38-311b-4fc4-a852-82e86a1be04a\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") "
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.429690 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-utilities\") pod \"627efd38-311b-4fc4-a852-82e86a1be04a\" (UID: \"627efd38-311b-4fc4-a852-82e86a1be04a\") "
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.433307 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-utilities" (OuterVolumeSpecName: "utilities") pod "627efd38-311b-4fc4-a852-82e86a1be04a" (UID: "627efd38-311b-4fc4-a852-82e86a1be04a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.435159 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627efd38-311b-4fc4-a852-82e86a1be04a-kube-api-access-ds5qn" (OuterVolumeSpecName: "kube-api-access-ds5qn") pod "627efd38-311b-4fc4-a852-82e86a1be04a" (UID: "627efd38-311b-4fc4-a852-82e86a1be04a"). InnerVolumeSpecName "kube-api-access-ds5qn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.480996 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "627efd38-311b-4fc4-a852-82e86a1be04a" (UID: "627efd38-311b-4fc4-a852-82e86a1be04a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.535170 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.535246 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627efd38-311b-4fc4-a852-82e86a1be04a-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.535270 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds5qn\" (UniqueName: \"kubernetes.io/projected/627efd38-311b-4fc4-a852-82e86a1be04a-kube-api-access-ds5qn\") on node \"crc\" DevicePath \"\""
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.795221 4744 generic.go:334] "Generic (PLEG): container finished" podID="627efd38-311b-4fc4-a852-82e86a1be04a" containerID="ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098" exitCode=0
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.795404 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25dtx" event={"ID":"627efd38-311b-4fc4-a852-82e86a1be04a","Type":"ContainerDied","Data":"ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098"}
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.795518 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25dtx" event={"ID":"627efd38-311b-4fc4-a852-82e86a1be04a","Type":"ContainerDied","Data":"6bc678ef1dd15e95fb7bb02233b382e8b82986b0857b9cbc98a2e8919d2958da"}
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.795565 4744 scope.go:117] "RemoveContainer" containerID="ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.795517 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25dtx"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.861013 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25dtx"]
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.866157 4744 scope.go:117] "RemoveContainer" containerID="a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.873634 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25dtx"]
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.904882 4744 scope.go:117] "RemoveContainer" containerID="2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.966973 4744 scope.go:117] "RemoveContainer" containerID="ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098"
Sep 30 03:30:22 crc kubenswrapper[4744]: E0930 03:30:22.967714 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098\": container with ID starting with ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098 not found: ID does not exist" containerID="ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.967779 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098"} err="failed to get container status \"ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098\": rpc error: code = NotFound desc = could not find container \"ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098\": container with ID starting with ae1105ccd4065077e04e9e6f3674409b433590c870272666b4d0b58798125098 not found: ID does not exist"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.967811 4744 scope.go:117] "RemoveContainer" containerID="a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84"
Sep 30 03:30:22 crc kubenswrapper[4744]: E0930 03:30:22.968415 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84\": container with ID starting with a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84 not found: ID does not exist" containerID="a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.968469 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84"} err="failed to get container status \"a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84\": rpc error: code = NotFound desc = could not find container \"a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84\": container with ID starting with a35f1f603353fde55797c351bba99947caf60e03409755dd0df33e076e7f3e84 not found: ID does not exist"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.968501 4744 scope.go:117] "RemoveContainer" containerID="2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8"
Sep 30 03:30:22 crc kubenswrapper[4744]: E0930 03:30:22.968988 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8\": container with ID starting with 2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8 not found: ID does not exist" containerID="2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8"
Sep 30 03:30:22 crc kubenswrapper[4744]: I0930 03:30:22.969024 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8"} err="failed to get container status \"2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8\": rpc error: code = NotFound desc = could not find container \"2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8\": container with ID starting with 2966170d310036349c06f63a70805c3a9611055a4a54519d76997ddea5a311c8 not found: ID does not exist"
Sep 30 03:30:23 crc kubenswrapper[4744]: I0930 03:30:23.522901 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" path="/var/lib/kubelet/pods/627efd38-311b-4fc4-a852-82e86a1be04a/volumes"
Sep 30 03:30:34 crc kubenswrapper[4744]: I0930 03:30:34.348328 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 03:30:34 crc kubenswrapper[4744]: I0930 03:30:34.349193 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 03:30:41 crc kubenswrapper[4744]: I0930 03:30:41.356173 4744 scope.go:117] "RemoveContainer" containerID="3f96b830334a792bece915589388a820996eb1f7aac1428b07a7c6bdb6871a19"
Sep 30 03:31:04 crc kubenswrapper[4744]: I0930 03:31:04.347758 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 03:31:04 crc kubenswrapper[4744]: I0930 03:31:04.348338 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 03:31:04 crc kubenswrapper[4744]: I0930 03:31:04.348405 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv"
Sep 30 03:31:04 crc kubenswrapper[4744]: I0930 03:31:04.349050 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 03:31:04 crc kubenswrapper[4744]: I0930 03:31:04.349117 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" gracePeriod=600
Sep 30 03:31:04 crc kubenswrapper[4744]: E0930 03:31:04.479029 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:31:05 crc kubenswrapper[4744]: I0930 03:31:05.285994 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" exitCode=0
Sep 30 03:31:05 crc kubenswrapper[4744]: I0930 03:31:05.286033 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"}
Sep 30 03:31:05 crc kubenswrapper[4744]: I0930 03:31:05.286065 4744 scope.go:117] "RemoveContainer" containerID="fe2e0c31b2f11e084705476ec0ebe78e94be3c2f8bdcb24273e81b7e0e5969e9"
Sep 30 03:31:05 crc kubenswrapper[4744]: I0930 03:31:05.286848 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:31:05 crc kubenswrapper[4744]: E0930 03:31:05.287218 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:31:20 crc kubenswrapper[4744]: I0930 03:31:20.503231 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:31:20 crc kubenswrapper[4744]: E0930 03:31:20.504045 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:31:31 crc kubenswrapper[4744]: I0930 03:31:31.505167 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:31:31 crc kubenswrapper[4744]: E0930 03:31:31.506183 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:31:43 crc kubenswrapper[4744]: I0930 03:31:43.516888 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:31:43 crc kubenswrapper[4744]: E0930 03:31:43.517927 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:31:54 crc kubenswrapper[4744]: I0930 03:31:54.504751 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:31:54 crc kubenswrapper[4744]: E0930 03:31:54.505850 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:32:08 crc kubenswrapper[4744]: I0930 03:32:08.504654 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:32:08 crc kubenswrapper[4744]: E0930 03:32:08.506041 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:32:21 crc kubenswrapper[4744]: I0930 03:32:21.504889 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:32:21 crc kubenswrapper[4744]: E0930 03:32:21.506095 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 03:32:33 crc kubenswrapper[4744]: I0930 03:32:33.509422 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73"
Sep 30 03:32:33 crc kubenswrapper[4744]: E0930 03:32:33.510189 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s
restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:32:47 crc kubenswrapper[4744]: I0930 03:32:47.504897 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:32:47 crc kubenswrapper[4744]: E0930 03:32:47.505988 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:32:59 crc kubenswrapper[4744]: I0930 03:32:59.505007 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:32:59 crc kubenswrapper[4744]: E0930 03:32:59.506298 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:33:12 crc kubenswrapper[4744]: I0930 03:33:12.503535 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:33:12 crc kubenswrapper[4744]: E0930 03:33:12.504582 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:33:25 crc kubenswrapper[4744]: I0930 03:33:25.505559 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:33:25 crc kubenswrapper[4744]: E0930 03:33:25.506692 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:33:40 crc kubenswrapper[4744]: I0930 03:33:40.505111 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:33:40 crc kubenswrapper[4744]: E0930 03:33:40.506335 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:33:54 crc kubenswrapper[4744]: I0930 03:33:54.504038 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:33:54 crc kubenswrapper[4744]: E0930 03:33:54.505101 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:34:06 crc kubenswrapper[4744]: I0930 03:34:06.503451 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:34:06 crc kubenswrapper[4744]: E0930 03:34:06.504521 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:34:17 crc kubenswrapper[4744]: I0930 03:34:17.503692 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:34:17 crc kubenswrapper[4744]: E0930 03:34:17.504475 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:34:28 crc kubenswrapper[4744]: I0930 03:34:28.503578 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:34:28 crc kubenswrapper[4744]: E0930 03:34:28.506613 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:34:32 crc kubenswrapper[4744]: I0930 03:34:32.776922 4744 generic.go:334] "Generic (PLEG): container finished" podID="fc1867d3-bb6f-4fca-876d-b868bcd284bb" containerID="dd8f67efeaa3771e0b53cbdf32ae97dfa2d0e20f6763bf70fe67b79979edf6f1" exitCode=0 Sep 30 03:34:32 crc kubenswrapper[4744]: I0930 03:34:32.777797 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" event={"ID":"fc1867d3-bb6f-4fca-876d-b868bcd284bb","Type":"ContainerDied","Data":"dd8f67efeaa3771e0b53cbdf32ae97dfa2d0e20f6763bf70fe67b79979edf6f1"} Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.350158 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.511350 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-inventory\") pod \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.511426 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-secret-0\") pod \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.511466 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz7sk\" (UniqueName: \"kubernetes.io/projected/fc1867d3-bb6f-4fca-876d-b868bcd284bb-kube-api-access-gz7sk\") pod \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.511577 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-ssh-key\") pod \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.511633 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-combined-ca-bundle\") pod \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\" (UID: \"fc1867d3-bb6f-4fca-876d-b868bcd284bb\") " Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.517046 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/fc1867d3-bb6f-4fca-876d-b868bcd284bb-kube-api-access-gz7sk" (OuterVolumeSpecName: "kube-api-access-gz7sk") pod "fc1867d3-bb6f-4fca-876d-b868bcd284bb" (UID: "fc1867d3-bb6f-4fca-876d-b868bcd284bb"). InnerVolumeSpecName "kube-api-access-gz7sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.519110 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fc1867d3-bb6f-4fca-876d-b868bcd284bb" (UID: "fc1867d3-bb6f-4fca-876d-b868bcd284bb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.540405 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc1867d3-bb6f-4fca-876d-b868bcd284bb" (UID: "fc1867d3-bb6f-4fca-876d-b868bcd284bb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.562473 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-inventory" (OuterVolumeSpecName: "inventory") pod "fc1867d3-bb6f-4fca-876d-b868bcd284bb" (UID: "fc1867d3-bb6f-4fca-876d-b868bcd284bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.567071 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fc1867d3-bb6f-4fca-876d-b868bcd284bb" (UID: "fc1867d3-bb6f-4fca-876d-b868bcd284bb"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.615563 4744 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.615638 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz7sk\" (UniqueName: \"kubernetes.io/projected/fc1867d3-bb6f-4fca-876d-b868bcd284bb-kube-api-access-gz7sk\") on node \"crc\" DevicePath \"\"" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.615664 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.615688 4744 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.615712 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc1867d3-bb6f-4fca-876d-b868bcd284bb-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.802546 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" event={"ID":"fc1867d3-bb6f-4fca-876d-b868bcd284bb","Type":"ContainerDied","Data":"2068a3c5550af8b923461be774f2f9ebad5f2d03c941bd6343cff84407f7e92d"} Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.802591 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2068a3c5550af8b923461be774f2f9ebad5f2d03c941bd6343cff84407f7e92d" Sep 30 
03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.802653 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.927854 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd"] Sep 30 03:34:34 crc kubenswrapper[4744]: E0930 03:34:34.928536 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1867d3-bb6f-4fca-876d-b868bcd284bb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.928555 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1867d3-bb6f-4fca-876d-b868bcd284bb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 03:34:34 crc kubenswrapper[4744]: E0930 03:34:34.928579 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="extract-utilities" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.928589 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="extract-utilities" Sep 30 03:34:34 crc kubenswrapper[4744]: E0930 03:34:34.928612 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="extract-content" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.928620 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="extract-content" Sep 30 03:34:34 crc kubenswrapper[4744]: E0930 03:34:34.928644 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="registry-server" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.928651 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" 
containerName="registry-server" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.928890 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1867d3-bb6f-4fca-876d-b868bcd284bb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.928915 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="627efd38-311b-4fc4-a852-82e86a1be04a" containerName="registry-server" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.930103 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.933857 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.934125 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.940496 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.940596 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.940497 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.940602 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.941107 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:34:34 crc kubenswrapper[4744]: I0930 03:34:34.968153 4744 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd"] Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.023655 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.024234 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.024549 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25mf9\" (UniqueName: \"kubernetes.io/projected/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-kube-api-access-25mf9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.024663 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.024696 4744 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.024744 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.024975 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.025048 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.025115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127634 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127687 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127715 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127773 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 
crc kubenswrapper[4744]: I0930 03:34:35.127832 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127887 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25mf9\" (UniqueName: \"kubernetes.io/projected/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-kube-api-access-25mf9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127910 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127926 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.127946 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.129357 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.132886 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.133051 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.133720 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 
03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.134275 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.134628 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.135168 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.135714 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.156739 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25mf9\" (UniqueName: \"kubernetes.io/projected/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-kube-api-access-25mf9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qj5zd\" (UID: 
\"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.272877 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.887827 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd"] Sep 30 03:34:35 crc kubenswrapper[4744]: I0930 03:34:35.891989 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:34:36 crc kubenswrapper[4744]: I0930 03:34:36.827671 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" event={"ID":"a0abbf7a-4bd2-4a60-a571-68eae4ea321c","Type":"ContainerStarted","Data":"dcc91a5b7333975816b453e77d498f09ebfaa3cfd4253177f04aa6fdf2ed8c16"} Sep 30 03:34:36 crc kubenswrapper[4744]: I0930 03:34:36.828332 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" event={"ID":"a0abbf7a-4bd2-4a60-a571-68eae4ea321c","Type":"ContainerStarted","Data":"6785ebd190e41b961a68c3b74002642d1c2c2f4d13bbc49c645104b58889bb1a"} Sep 30 03:34:36 crc kubenswrapper[4744]: I0930 03:34:36.863192 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" podStartSLOduration=2.327748589 podStartE2EDuration="2.8631641s" podCreationTimestamp="2025-09-30 03:34:34 +0000 UTC" firstStartedPulling="2025-09-30 03:34:35.891354284 +0000 UTC m=+2403.064574308" lastFinishedPulling="2025-09-30 03:34:36.426769815 +0000 UTC m=+2403.599989819" observedRunningTime="2025-09-30 03:34:36.852414266 +0000 UTC m=+2404.025634280" watchObservedRunningTime="2025-09-30 03:34:36.8631641 +0000 UTC m=+2404.036384114" Sep 30 03:34:39 crc 
kubenswrapper[4744]: I0930 03:34:39.504173 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:34:39 crc kubenswrapper[4744]: E0930 03:34:39.505132 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:34:51 crc kubenswrapper[4744]: I0930 03:34:51.504623 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:34:51 crc kubenswrapper[4744]: E0930 03:34:51.507007 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:35:03 crc kubenswrapper[4744]: I0930 03:35:03.531454 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:35:03 crc kubenswrapper[4744]: E0930 03:35:03.535675 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 
30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.675794 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bp2b6"] Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.681078 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.706316 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp2b6"] Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.785212 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-catalog-content\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.785331 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-utilities\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.785451 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6kz7\" (UniqueName: \"kubernetes.io/projected/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-kube-api-access-f6kz7\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.890024 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-catalog-content\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.890518 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-catalog-content\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.891345 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-utilities\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.891655 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-utilities\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.891733 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6kz7\" (UniqueName: \"kubernetes.io/projected/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-kube-api-access-f6kz7\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:14 crc kubenswrapper[4744]: I0930 03:35:14.909736 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6kz7\" (UniqueName: 
\"kubernetes.io/projected/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-kube-api-access-f6kz7\") pod \"redhat-marketplace-bp2b6\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:15 crc kubenswrapper[4744]: I0930 03:35:15.010663 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:15 crc kubenswrapper[4744]: I0930 03:35:15.459677 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp2b6"] Sep 30 03:35:15 crc kubenswrapper[4744]: W0930 03:35:15.462660 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d5fefa_ab79_4588_9f2e_83a7d5d91a21.slice/crio-6d518020e34a5d8f3c8ca4c0f73833c3f192a9c106cf3440a25d7ff13a1e27d9 WatchSource:0}: Error finding container 6d518020e34a5d8f3c8ca4c0f73833c3f192a9c106cf3440a25d7ff13a1e27d9: Status 404 returned error can't find the container with id 6d518020e34a5d8f3c8ca4c0f73833c3f192a9c106cf3440a25d7ff13a1e27d9 Sep 30 03:35:16 crc kubenswrapper[4744]: I0930 03:35:16.321160 4744 generic.go:334] "Generic (PLEG): container finished" podID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerID="a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b" exitCode=0 Sep 30 03:35:16 crc kubenswrapper[4744]: I0930 03:35:16.321218 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerDied","Data":"a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b"} Sep 30 03:35:16 crc kubenswrapper[4744]: I0930 03:35:16.321679 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" 
event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerStarted","Data":"6d518020e34a5d8f3c8ca4c0f73833c3f192a9c106cf3440a25d7ff13a1e27d9"} Sep 30 03:35:16 crc kubenswrapper[4744]: I0930 03:35:16.504415 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:35:16 crc kubenswrapper[4744]: E0930 03:35:16.504868 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:35:17 crc kubenswrapper[4744]: I0930 03:35:17.332654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerStarted","Data":"93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047"} Sep 30 03:35:18 crc kubenswrapper[4744]: I0930 03:35:18.347437 4744 generic.go:334] "Generic (PLEG): container finished" podID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerID="93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047" exitCode=0 Sep 30 03:35:18 crc kubenswrapper[4744]: I0930 03:35:18.347592 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerDied","Data":"93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047"} Sep 30 03:35:19 crc kubenswrapper[4744]: I0930 03:35:19.359523 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" 
event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerStarted","Data":"5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856"} Sep 30 03:35:19 crc kubenswrapper[4744]: I0930 03:35:19.384364 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bp2b6" podStartSLOduration=2.668841498 podStartE2EDuration="5.384343434s" podCreationTimestamp="2025-09-30 03:35:14 +0000 UTC" firstStartedPulling="2025-09-30 03:35:16.322961748 +0000 UTC m=+2443.496181722" lastFinishedPulling="2025-09-30 03:35:19.038463644 +0000 UTC m=+2446.211683658" observedRunningTime="2025-09-30 03:35:19.38002723 +0000 UTC m=+2446.553247234" watchObservedRunningTime="2025-09-30 03:35:19.384343434 +0000 UTC m=+2446.557563418" Sep 30 03:35:25 crc kubenswrapper[4744]: I0930 03:35:25.011461 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:25 crc kubenswrapper[4744]: I0930 03:35:25.013703 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:25 crc kubenswrapper[4744]: I0930 03:35:25.080017 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:25 crc kubenswrapper[4744]: I0930 03:35:25.518255 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:25 crc kubenswrapper[4744]: I0930 03:35:25.567797 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp2b6"] Sep 30 03:35:27 crc kubenswrapper[4744]: I0930 03:35:27.457684 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bp2b6" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="registry-server" 
containerID="cri-o://5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856" gracePeriod=2 Sep 30 03:35:27 crc kubenswrapper[4744]: I0930 03:35:27.504865 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:35:27 crc kubenswrapper[4744]: E0930 03:35:27.505064 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:35:27 crc kubenswrapper[4744]: I0930 03:35:27.976207 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.081757 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-utilities\") pod \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.081856 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6kz7\" (UniqueName: \"kubernetes.io/projected/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-kube-api-access-f6kz7\") pod \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\" (UID: \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.081939 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-catalog-content\") pod \"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\" (UID: 
\"38d5fefa-ab79-4588-9f2e-83a7d5d91a21\") " Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.082732 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-utilities" (OuterVolumeSpecName: "utilities") pod "38d5fefa-ab79-4588-9f2e-83a7d5d91a21" (UID: "38d5fefa-ab79-4588-9f2e-83a7d5d91a21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.087024 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-kube-api-access-f6kz7" (OuterVolumeSpecName: "kube-api-access-f6kz7") pod "38d5fefa-ab79-4588-9f2e-83a7d5d91a21" (UID: "38d5fefa-ab79-4588-9f2e-83a7d5d91a21"). InnerVolumeSpecName "kube-api-access-f6kz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.103054 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38d5fefa-ab79-4588-9f2e-83a7d5d91a21" (UID: "38d5fefa-ab79-4588-9f2e-83a7d5d91a21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.184794 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.184838 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6kz7\" (UniqueName: \"kubernetes.io/projected/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-kube-api-access-f6kz7\") on node \"crc\" DevicePath \"\"" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.184875 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d5fefa-ab79-4588-9f2e-83a7d5d91a21-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.480427 4744 generic.go:334] "Generic (PLEG): container finished" podID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerID="5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856" exitCode=0 Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.480515 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bp2b6" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.480542 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerDied","Data":"5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856"} Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.480989 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bp2b6" event={"ID":"38d5fefa-ab79-4588-9f2e-83a7d5d91a21","Type":"ContainerDied","Data":"6d518020e34a5d8f3c8ca4c0f73833c3f192a9c106cf3440a25d7ff13a1e27d9"} Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.481023 4744 scope.go:117] "RemoveContainer" containerID="5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.525841 4744 scope.go:117] "RemoveContainer" containerID="93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.555902 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp2b6"] Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.557505 4744 scope.go:117] "RemoveContainer" containerID="a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.583032 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bp2b6"] Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.620966 4744 scope.go:117] "RemoveContainer" containerID="5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856" Sep 30 03:35:28 crc kubenswrapper[4744]: E0930 03:35:28.621569 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856\": container with ID starting with 5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856 not found: ID does not exist" containerID="5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.621604 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856"} err="failed to get container status \"5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856\": rpc error: code = NotFound desc = could not find container \"5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856\": container with ID starting with 5d0133bfe96b4990bd92d155a0451e5810c04e1455805799af4c8095cdcfd856 not found: ID does not exist" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.621628 4744 scope.go:117] "RemoveContainer" containerID="93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047" Sep 30 03:35:28 crc kubenswrapper[4744]: E0930 03:35:28.621958 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047\": container with ID starting with 93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047 not found: ID does not exist" containerID="93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.621985 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047"} err="failed to get container status \"93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047\": rpc error: code = NotFound desc = could not find container \"93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047\": container with ID 
starting with 93f8eeb629c27777f9870a4e501b483cece3d8fd9067804b5d0c48e83165e047 not found: ID does not exist" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.622002 4744 scope.go:117] "RemoveContainer" containerID="a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b" Sep 30 03:35:28 crc kubenswrapper[4744]: E0930 03:35:28.622600 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b\": container with ID starting with a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b not found: ID does not exist" containerID="a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b" Sep 30 03:35:28 crc kubenswrapper[4744]: I0930 03:35:28.622628 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b"} err="failed to get container status \"a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b\": rpc error: code = NotFound desc = could not find container \"a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b\": container with ID starting with a2f5d789c65a1644e4928f98159376b2fff7a45de72a04c186bdc704bbb0678b not found: ID does not exist" Sep 30 03:35:29 crc kubenswrapper[4744]: I0930 03:35:29.544677 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" path="/var/lib/kubelet/pods/38d5fefa-ab79-4588-9f2e-83a7d5d91a21/volumes" Sep 30 03:35:41 crc kubenswrapper[4744]: I0930 03:35:41.504754 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:35:41 crc kubenswrapper[4744]: E0930 03:35:41.505748 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:35:56 crc kubenswrapper[4744]: I0930 03:35:56.504709 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:35:56 crc kubenswrapper[4744]: E0930 03:35:56.505844 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:36:08 crc kubenswrapper[4744]: I0930 03:36:08.503913 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:36:08 crc kubenswrapper[4744]: I0930 03:36:08.968352 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"dc88f8e7a76aaeb4ef100e7ac49085589ff7268236e04a20eee97f324ef466a8"} Sep 30 03:38:16 crc kubenswrapper[4744]: I0930 03:38:16.399127 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0abbf7a-4bd2-4a60-a571-68eae4ea321c" containerID="dcc91a5b7333975816b453e77d498f09ebfaa3cfd4253177f04aa6fdf2ed8c16" exitCode=0 Sep 30 03:38:16 crc kubenswrapper[4744]: I0930 03:38:16.399249 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" 
event={"ID":"a0abbf7a-4bd2-4a60-a571-68eae4ea321c","Type":"ContainerDied","Data":"dcc91a5b7333975816b453e77d498f09ebfaa3cfd4253177f04aa6fdf2ed8c16"} Sep 30 03:38:17 crc kubenswrapper[4744]: I0930 03:38:17.859670 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.018878 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-0\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.018981 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-extra-config-0\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019102 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25mf9\" (UniqueName: \"kubernetes.io/projected/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-kube-api-access-25mf9\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019293 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-1\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019342 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-inventory\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019433 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-combined-ca-bundle\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019526 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-1\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019563 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-ssh-key\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.019683 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-0\") pod \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\" (UID: \"a0abbf7a-4bd2-4a60-a571-68eae4ea321c\") " Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.027009 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-kube-api-access-25mf9" (OuterVolumeSpecName: "kube-api-access-25mf9") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). 
InnerVolumeSpecName "kube-api-access-25mf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.039510 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.062677 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.062771 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.073886 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.080048 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.083495 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.086297 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-inventory" (OuterVolumeSpecName: "inventory") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.105525 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a0abbf7a-4bd2-4a60-a571-68eae4ea321c" (UID: "a0abbf7a-4bd2-4a60-a571-68eae4ea321c"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123517 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25mf9\" (UniqueName: \"kubernetes.io/projected/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-kube-api-access-25mf9\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123548 4744 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123561 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123571 4744 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123579 4744 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123588 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123597 4744 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 
03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123606 4744 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.123616 4744 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a0abbf7a-4bd2-4a60-a571-68eae4ea321c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.425275 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" event={"ID":"a0abbf7a-4bd2-4a60-a571-68eae4ea321c","Type":"ContainerDied","Data":"6785ebd190e41b961a68c3b74002642d1c2c2f4d13bbc49c645104b58889bb1a"} Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.425314 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6785ebd190e41b961a68c3b74002642d1c2c2f4d13bbc49c645104b58889bb1a" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.425428 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qj5zd" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.558224 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts"] Sep 30 03:38:18 crc kubenswrapper[4744]: E0930 03:38:18.558797 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="registry-server" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.558831 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="registry-server" Sep 30 03:38:18 crc kubenswrapper[4744]: E0930 03:38:18.558861 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="extract-utilities" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.558875 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="extract-utilities" Sep 30 03:38:18 crc kubenswrapper[4744]: E0930 03:38:18.558896 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="extract-content" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.558908 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="extract-content" Sep 30 03:38:18 crc kubenswrapper[4744]: E0930 03:38:18.558958 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0abbf7a-4bd2-4a60-a571-68eae4ea321c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.558970 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0abbf7a-4bd2-4a60-a571-68eae4ea321c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.559252 4744 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a0abbf7a-4bd2-4a60-a571-68eae4ea321c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.559282 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d5fefa-ab79-4588-9f2e-83a7d5d91a21" containerName="registry-server" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.560001 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts"] Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.560092 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.577717 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g4nzl" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.578013 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.577898 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.578429 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.577952 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736173 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: 
\"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736217 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736261 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736292 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736310 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn7cn\" (UniqueName: \"kubernetes.io/projected/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-kube-api-access-nn7cn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736360 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.736439 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.838354 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.838656 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn7cn\" (UniqueName: \"kubernetes.io/projected/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-kube-api-access-nn7cn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.838764 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.838877 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.838953 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.838980 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.839033 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.842793 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.842874 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.853482 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.854212 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: 
I0930 03:38:18.855708 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.859845 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.866760 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn7cn\" (UniqueName: \"kubernetes.io/projected/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-kube-api-access-nn7cn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rkxts\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:18 crc kubenswrapper[4744]: I0930 03:38:18.890363 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:38:19 crc kubenswrapper[4744]: I0930 03:38:19.405950 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts"] Sep 30 03:38:19 crc kubenswrapper[4744]: I0930 03:38:19.437121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" event={"ID":"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c","Type":"ContainerStarted","Data":"f8601574d8c29dc8df6425bbfcb7415a4e058f9bd99fa4db52245bcdd54bf1dd"} Sep 30 03:38:20 crc kubenswrapper[4744]: I0930 03:38:20.449086 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" event={"ID":"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c","Type":"ContainerStarted","Data":"5205f42bd5ec51a3c39af60a86f4554a4376f5594b6959c33bea5f8ba21595d9"} Sep 30 03:38:20 crc kubenswrapper[4744]: I0930 03:38:20.476406 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" podStartSLOduration=2.01150882 podStartE2EDuration="2.476387269s" podCreationTimestamp="2025-09-30 03:38:18 +0000 UTC" firstStartedPulling="2025-09-30 03:38:19.42277955 +0000 UTC m=+2626.595999544" lastFinishedPulling="2025-09-30 03:38:19.887658019 +0000 UTC m=+2627.060877993" observedRunningTime="2025-09-30 03:38:20.470053872 +0000 UTC m=+2627.643273886" watchObservedRunningTime="2025-09-30 03:38:20.476387269 +0000 UTC m=+2627.649607253" Sep 30 03:38:34 crc kubenswrapper[4744]: I0930 03:38:34.347657 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:38:34 crc kubenswrapper[4744]: 
I0930 03:38:34.348324 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:39:04 crc kubenswrapper[4744]: I0930 03:39:04.347721 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:39:04 crc kubenswrapper[4744]: I0930 03:39:04.348283 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.355769 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zgz2n"] Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.361505 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.371976 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgz2n"] Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.539244 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-utilities\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.539342 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x8ng\" (UniqueName: \"kubernetes.io/projected/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-kube-api-access-7x8ng\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.539516 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-catalog-content\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.641553 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-catalog-content\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.641802 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-utilities\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.641859 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x8ng\" (UniqueName: \"kubernetes.io/projected/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-kube-api-access-7x8ng\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.643424 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-catalog-content\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.645338 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-utilities\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.678464 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x8ng\" (UniqueName: \"kubernetes.io/projected/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-kube-api-access-7x8ng\") pod \"certified-operators-zgz2n\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:31 crc kubenswrapper[4744]: I0930 03:39:31.693707 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:32 crc kubenswrapper[4744]: I0930 03:39:32.214904 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgz2n"] Sep 30 03:39:32 crc kubenswrapper[4744]: W0930 03:39:32.217955 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ba844b_e33a_4d8b_ba17_c33f4bfd7fa9.slice/crio-4c3c62779d62afd627367183b52b76d1fbd6b630c64b3516c1234b78168c767c WatchSource:0}: Error finding container 4c3c62779d62afd627367183b52b76d1fbd6b630c64b3516c1234b78168c767c: Status 404 returned error can't find the container with id 4c3c62779d62afd627367183b52b76d1fbd6b630c64b3516c1234b78168c767c Sep 30 03:39:32 crc kubenswrapper[4744]: I0930 03:39:32.339944 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerStarted","Data":"4c3c62779d62afd627367183b52b76d1fbd6b630c64b3516c1234b78168c767c"} Sep 30 03:39:33 crc kubenswrapper[4744]: I0930 03:39:33.352541 4744 generic.go:334] "Generic (PLEG): container finished" podID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerID="1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef" exitCode=0 Sep 30 03:39:33 crc kubenswrapper[4744]: I0930 03:39:33.352664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerDied","Data":"1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef"} Sep 30 03:39:34 crc kubenswrapper[4744]: I0930 03:39:34.348014 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:39:34 crc kubenswrapper[4744]: I0930 03:39:34.348429 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:39:34 crc kubenswrapper[4744]: I0930 03:39:34.348493 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:39:34 crc kubenswrapper[4744]: I0930 03:39:34.349682 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc88f8e7a76aaeb4ef100e7ac49085589ff7268236e04a20eee97f324ef466a8"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:39:34 crc kubenswrapper[4744]: I0930 03:39:34.349817 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://dc88f8e7a76aaeb4ef100e7ac49085589ff7268236e04a20eee97f324ef466a8" gracePeriod=600 Sep 30 03:39:34 crc kubenswrapper[4744]: I0930 03:39:34.365619 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerStarted","Data":"29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75"} Sep 30 03:39:35 crc kubenswrapper[4744]: I0930 03:39:35.381137 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" 
containerID="dc88f8e7a76aaeb4ef100e7ac49085589ff7268236e04a20eee97f324ef466a8" exitCode=0 Sep 30 03:39:35 crc kubenswrapper[4744]: I0930 03:39:35.381331 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"dc88f8e7a76aaeb4ef100e7ac49085589ff7268236e04a20eee97f324ef466a8"} Sep 30 03:39:35 crc kubenswrapper[4744]: I0930 03:39:35.381931 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194"} Sep 30 03:39:35 crc kubenswrapper[4744]: I0930 03:39:35.381969 4744 scope.go:117] "RemoveContainer" containerID="797d84b1ed17dda853954dac81631a2a66df86d823496561074357ce7dfb4b73" Sep 30 03:39:35 crc kubenswrapper[4744]: I0930 03:39:35.397612 4744 generic.go:334] "Generic (PLEG): container finished" podID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerID="29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75" exitCode=0 Sep 30 03:39:35 crc kubenswrapper[4744]: I0930 03:39:35.397665 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerDied","Data":"29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75"} Sep 30 03:39:36 crc kubenswrapper[4744]: I0930 03:39:36.445077 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerStarted","Data":"fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64"} Sep 30 03:39:36 crc kubenswrapper[4744]: I0930 03:39:36.466956 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-zgz2n" podStartSLOduration=2.933813002 podStartE2EDuration="5.466931269s" podCreationTimestamp="2025-09-30 03:39:31 +0000 UTC" firstStartedPulling="2025-09-30 03:39:33.355314321 +0000 UTC m=+2700.528534295" lastFinishedPulling="2025-09-30 03:39:35.888432558 +0000 UTC m=+2703.061652562" observedRunningTime="2025-09-30 03:39:36.466397343 +0000 UTC m=+2703.639617337" watchObservedRunningTime="2025-09-30 03:39:36.466931269 +0000 UTC m=+2703.640151283" Sep 30 03:39:41 crc kubenswrapper[4744]: I0930 03:39:41.694217 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:41 crc kubenswrapper[4744]: I0930 03:39:41.694812 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:41 crc kubenswrapper[4744]: I0930 03:39:41.764923 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:42 crc kubenswrapper[4744]: I0930 03:39:42.585127 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:42 crc kubenswrapper[4744]: I0930 03:39:42.646572 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgz2n"] Sep 30 03:39:44 crc kubenswrapper[4744]: I0930 03:39:44.543922 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zgz2n" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="registry-server" containerID="cri-o://fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64" gracePeriod=2 Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.053008 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.179553 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-catalog-content\") pod \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.179628 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-utilities\") pod \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.179769 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x8ng\" (UniqueName: \"kubernetes.io/projected/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-kube-api-access-7x8ng\") pod \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\" (UID: \"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9\") " Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.181399 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-utilities" (OuterVolumeSpecName: "utilities") pod "27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" (UID: "27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.188310 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-kube-api-access-7x8ng" (OuterVolumeSpecName: "kube-api-access-7x8ng") pod "27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" (UID: "27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9"). InnerVolumeSpecName "kube-api-access-7x8ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.256020 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" (UID: "27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.283302 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.283351 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.283397 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x8ng\" (UniqueName: \"kubernetes.io/projected/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9-kube-api-access-7x8ng\") on node \"crc\" DevicePath \"\"" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.565188 4744 generic.go:334] "Generic (PLEG): container finished" podID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerID="fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64" exitCode=0 Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.565252 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerDied","Data":"fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64"} Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.565291 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zgz2n" event={"ID":"27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9","Type":"ContainerDied","Data":"4c3c62779d62afd627367183b52b76d1fbd6b630c64b3516c1234b78168c767c"} Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.565319 4744 scope.go:117] "RemoveContainer" containerID="fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.565319 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgz2n" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.605634 4744 scope.go:117] "RemoveContainer" containerID="29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.613892 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgz2n"] Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.628885 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zgz2n"] Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.631461 4744 scope.go:117] "RemoveContainer" containerID="1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.703547 4744 scope.go:117] "RemoveContainer" containerID="fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64" Sep 30 03:39:45 crc kubenswrapper[4744]: E0930 03:39:45.704092 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64\": container with ID starting with fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64 not found: ID does not exist" containerID="fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 
03:39:45.704147 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64"} err="failed to get container status \"fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64\": rpc error: code = NotFound desc = could not find container \"fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64\": container with ID starting with fa487b9070dc515a39e2058154f352a3b0f5c443c4fa61ee0d7c8940c8a35b64 not found: ID does not exist" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.704181 4744 scope.go:117] "RemoveContainer" containerID="29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75" Sep 30 03:39:45 crc kubenswrapper[4744]: E0930 03:39:45.704811 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75\": container with ID starting with 29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75 not found: ID does not exist" containerID="29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.704838 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75"} err="failed to get container status \"29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75\": rpc error: code = NotFound desc = could not find container \"29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75\": container with ID starting with 29a1b6eed3451b8d116d3515d8d012817ffc3cf2bf6df011d2ff5d880cafed75 not found: ID does not exist" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.704887 4744 scope.go:117] "RemoveContainer" containerID="1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef" Sep 30 03:39:45 crc 
kubenswrapper[4744]: E0930 03:39:45.705206 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef\": container with ID starting with 1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef not found: ID does not exist" containerID="1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef" Sep 30 03:39:45 crc kubenswrapper[4744]: I0930 03:39:45.705261 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef"} err="failed to get container status \"1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef\": rpc error: code = NotFound desc = could not find container \"1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef\": container with ID starting with 1694893a271dc4da84311971fb8463edcfb098969f9ae1a773f19983660534ef not found: ID does not exist" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.424477 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rmjjr"] Sep 30 03:39:47 crc kubenswrapper[4744]: E0930 03:39:47.425873 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="extract-utilities" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.425897 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="extract-utilities" Sep 30 03:39:47 crc kubenswrapper[4744]: E0930 03:39:47.425963 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="extract-content" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.425976 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" 
containerName="extract-content" Sep 30 03:39:47 crc kubenswrapper[4744]: E0930 03:39:47.426006 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="registry-server" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.426019 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="registry-server" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.426364 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" containerName="registry-server" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.431166 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.434616 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rmjjr"] Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.518605 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9" path="/var/lib/kubelet/pods/27ba844b-e33a-4d8b-ba17-c33f4bfd7fa9/volumes" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.535457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcwk\" (UniqueName: \"kubernetes.io/projected/da315cdf-b092-43e9-a1e1-71de3d0c3282-kube-api-access-xwcwk\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.535555 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-utilities\") pod \"redhat-operators-rmjjr\" (UID: 
\"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.535582 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-catalog-content\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.636844 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-utilities\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.637398 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-catalog-content\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.637643 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcwk\" (UniqueName: \"kubernetes.io/projected/da315cdf-b092-43e9-a1e1-71de3d0c3282-kube-api-access-xwcwk\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.639100 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-catalog-content\") pod \"redhat-operators-rmjjr\" (UID: 
\"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.639205 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-utilities\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.655443 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcwk\" (UniqueName: \"kubernetes.io/projected/da315cdf-b092-43e9-a1e1-71de3d0c3282-kube-api-access-xwcwk\") pod \"redhat-operators-rmjjr\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:47 crc kubenswrapper[4744]: I0930 03:39:47.774303 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:48 crc kubenswrapper[4744]: I0930 03:39:48.265528 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rmjjr"] Sep 30 03:39:48 crc kubenswrapper[4744]: W0930 03:39:48.267879 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda315cdf_b092_43e9_a1e1_71de3d0c3282.slice/crio-8290c487215d1884c05e841119bb23a52f56319c64f3811a534f3a345360b7bf WatchSource:0}: Error finding container 8290c487215d1884c05e841119bb23a52f56319c64f3811a534f3a345360b7bf: Status 404 returned error can't find the container with id 8290c487215d1884c05e841119bb23a52f56319c64f3811a534f3a345360b7bf Sep 30 03:39:48 crc kubenswrapper[4744]: I0930 03:39:48.631199 4744 generic.go:334] "Generic (PLEG): container finished" podID="da315cdf-b092-43e9-a1e1-71de3d0c3282" 
containerID="c7f0ab35cac7965f7ff0babaea506860f52772b5124819e4ab1690c4706bd7cb" exitCode=0 Sep 30 03:39:48 crc kubenswrapper[4744]: I0930 03:39:48.631556 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmjjr" event={"ID":"da315cdf-b092-43e9-a1e1-71de3d0c3282","Type":"ContainerDied","Data":"c7f0ab35cac7965f7ff0babaea506860f52772b5124819e4ab1690c4706bd7cb"} Sep 30 03:39:48 crc kubenswrapper[4744]: I0930 03:39:48.631582 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmjjr" event={"ID":"da315cdf-b092-43e9-a1e1-71de3d0c3282","Type":"ContainerStarted","Data":"8290c487215d1884c05e841119bb23a52f56319c64f3811a534f3a345360b7bf"} Sep 30 03:39:48 crc kubenswrapper[4744]: I0930 03:39:48.640599 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:39:50 crc kubenswrapper[4744]: I0930 03:39:50.656412 4744 generic.go:334] "Generic (PLEG): container finished" podID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerID="add2915072424feddb6e1f9fef5dfef186d1c20c5b7fdf48187076a2daeb803d" exitCode=0 Sep 30 03:39:50 crc kubenswrapper[4744]: I0930 03:39:50.656497 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmjjr" event={"ID":"da315cdf-b092-43e9-a1e1-71de3d0c3282","Type":"ContainerDied","Data":"add2915072424feddb6e1f9fef5dfef186d1c20c5b7fdf48187076a2daeb803d"} Sep 30 03:39:51 crc kubenswrapper[4744]: I0930 03:39:51.673591 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmjjr" event={"ID":"da315cdf-b092-43e9-a1e1-71de3d0c3282","Type":"ContainerStarted","Data":"abf22e550945cc7080a593be9cbe2912d19724837328181606364f7e02dd344f"} Sep 30 03:39:51 crc kubenswrapper[4744]: I0930 03:39:51.705895 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rmjjr" 
podStartSLOduration=2.026000812 podStartE2EDuration="4.70587543s" podCreationTimestamp="2025-09-30 03:39:47 +0000 UTC" firstStartedPulling="2025-09-30 03:39:48.640255671 +0000 UTC m=+2715.813475645" lastFinishedPulling="2025-09-30 03:39:51.320130249 +0000 UTC m=+2718.493350263" observedRunningTime="2025-09-30 03:39:51.701153603 +0000 UTC m=+2718.874373617" watchObservedRunningTime="2025-09-30 03:39:51.70587543 +0000 UTC m=+2718.879095414" Sep 30 03:39:57 crc kubenswrapper[4744]: I0930 03:39:57.775219 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:57 crc kubenswrapper[4744]: I0930 03:39:57.775909 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:57 crc kubenswrapper[4744]: I0930 03:39:57.851016 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:58 crc kubenswrapper[4744]: I0930 03:39:58.824593 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:39:58 crc kubenswrapper[4744]: I0930 03:39:58.903972 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rmjjr"] Sep 30 03:40:00 crc kubenswrapper[4744]: I0930 03:40:00.775204 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rmjjr" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="registry-server" containerID="cri-o://abf22e550945cc7080a593be9cbe2912d19724837328181606364f7e02dd344f" gracePeriod=2 Sep 30 03:40:01 crc kubenswrapper[4744]: I0930 03:40:01.791958 4744 generic.go:334] "Generic (PLEG): container finished" podID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerID="abf22e550945cc7080a593be9cbe2912d19724837328181606364f7e02dd344f" exitCode=0 Sep 30 
03:40:01 crc kubenswrapper[4744]: I0930 03:40:01.792070 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmjjr" event={"ID":"da315cdf-b092-43e9-a1e1-71de3d0c3282","Type":"ContainerDied","Data":"abf22e550945cc7080a593be9cbe2912d19724837328181606364f7e02dd344f"} Sep 30 03:40:01 crc kubenswrapper[4744]: I0930 03:40:01.792396 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rmjjr" event={"ID":"da315cdf-b092-43e9-a1e1-71de3d0c3282","Type":"ContainerDied","Data":"8290c487215d1884c05e841119bb23a52f56319c64f3811a534f3a345360b7bf"} Sep 30 03:40:01 crc kubenswrapper[4744]: I0930 03:40:01.792419 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8290c487215d1884c05e841119bb23a52f56319c64f3811a534f3a345360b7bf" Sep 30 03:40:01 crc kubenswrapper[4744]: I0930 03:40:01.881838 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.005974 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwcwk\" (UniqueName: \"kubernetes.io/projected/da315cdf-b092-43e9-a1e1-71de3d0c3282-kube-api-access-xwcwk\") pod \"da315cdf-b092-43e9-a1e1-71de3d0c3282\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.006191 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-utilities\") pod \"da315cdf-b092-43e9-a1e1-71de3d0c3282\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.006238 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-catalog-content\") pod \"da315cdf-b092-43e9-a1e1-71de3d0c3282\" (UID: \"da315cdf-b092-43e9-a1e1-71de3d0c3282\") " Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.007750 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-utilities" (OuterVolumeSpecName: "utilities") pod "da315cdf-b092-43e9-a1e1-71de3d0c3282" (UID: "da315cdf-b092-43e9-a1e1-71de3d0c3282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.012687 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da315cdf-b092-43e9-a1e1-71de3d0c3282-kube-api-access-xwcwk" (OuterVolumeSpecName: "kube-api-access-xwcwk") pod "da315cdf-b092-43e9-a1e1-71de3d0c3282" (UID: "da315cdf-b092-43e9-a1e1-71de3d0c3282"). InnerVolumeSpecName "kube-api-access-xwcwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.081629 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da315cdf-b092-43e9-a1e1-71de3d0c3282" (UID: "da315cdf-b092-43e9-a1e1-71de3d0c3282"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.108165 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.108194 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da315cdf-b092-43e9-a1e1-71de3d0c3282-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.108206 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwcwk\" (UniqueName: \"kubernetes.io/projected/da315cdf-b092-43e9-a1e1-71de3d0c3282-kube-api-access-xwcwk\") on node \"crc\" DevicePath \"\"" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.805633 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rmjjr" Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.857518 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rmjjr"] Sep 30 03:40:02 crc kubenswrapper[4744]: I0930 03:40:02.873002 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rmjjr"] Sep 30 03:40:03 crc kubenswrapper[4744]: I0930 03:40:03.523034 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" path="/var/lib/kubelet/pods/da315cdf-b092-43e9-a1e1-71de3d0c3282/volumes" Sep 30 03:40:59 crc kubenswrapper[4744]: I0930 03:40:59.500779 4744 generic.go:334] "Generic (PLEG): container finished" podID="cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" containerID="5205f42bd5ec51a3c39af60a86f4554a4376f5594b6959c33bea5f8ba21595d9" exitCode=0 Sep 30 03:40:59 crc kubenswrapper[4744]: I0930 03:40:59.500873 4744 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" event={"ID":"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c","Type":"ContainerDied","Data":"5205f42bd5ec51a3c39af60a86f4554a4376f5594b6959c33bea5f8ba21595d9"} Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.140609 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.281533 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-0\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.281688 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn7cn\" (UniqueName: \"kubernetes.io/projected/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-kube-api-access-nn7cn\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.281782 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-inventory\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.281838 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ssh-key\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.281875 4744 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-1\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.281965 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-telemetry-combined-ca-bundle\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.282012 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-2\") pod \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\" (UID: \"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c\") " Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.287932 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-kube-api-access-nn7cn" (OuterVolumeSpecName: "kube-api-access-nn7cn") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "kube-api-access-nn7cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.290537 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.319713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-inventory" (OuterVolumeSpecName: "inventory") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.331175 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.338941 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.340014 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.351157 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" (UID: "cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385250 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385295 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn7cn\" (UniqueName: \"kubernetes.io/projected/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-kube-api-access-nn7cn\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385314 4744 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385331 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385348 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385389 4744 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.385407 4744 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.525455 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" event={"ID":"cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c","Type":"ContainerDied","Data":"f8601574d8c29dc8df6425bbfcb7415a4e058f9bd99fa4db52245bcdd54bf1dd"} Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.525534 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8601574d8c29dc8df6425bbfcb7415a4e058f9bd99fa4db52245bcdd54bf1dd" Sep 30 03:41:01 crc kubenswrapper[4744]: I0930 03:41:01.525557 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rkxts" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.808234 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8482f"] Sep 30 03:41:19 crc kubenswrapper[4744]: E0930 03:41:19.809195 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="extract-content" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.809212 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="extract-content" Sep 30 03:41:19 crc kubenswrapper[4744]: E0930 03:41:19.809225 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.809234 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 03:41:19 crc kubenswrapper[4744]: E0930 03:41:19.809267 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="extract-utilities" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.809277 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="extract-utilities" Sep 30 03:41:19 crc kubenswrapper[4744]: E0930 03:41:19.809302 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="registry-server" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.809309 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="registry-server" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.810069 4744 
memory_manager.go:354] "RemoveStaleState removing state" podUID="da315cdf-b092-43e9-a1e1-71de3d0c3282" containerName="registry-server" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.810114 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.813436 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.829942 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8482f"] Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.853207 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk965\" (UniqueName: \"kubernetes.io/projected/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-kube-api-access-qk965\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.853412 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-catalog-content\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.853450 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-utilities\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 
crc kubenswrapper[4744]: I0930 03:41:19.954901 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk965\" (UniqueName: \"kubernetes.io/projected/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-kube-api-access-qk965\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.955062 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-catalog-content\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.955086 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-utilities\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.955802 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-catalog-content\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 03:41:19.955811 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-utilities\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:19 crc kubenswrapper[4744]: I0930 
03:41:19.973949 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk965\" (UniqueName: \"kubernetes.io/projected/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-kube-api-access-qk965\") pod \"community-operators-8482f\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:20 crc kubenswrapper[4744]: I0930 03:41:20.140045 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:20 crc kubenswrapper[4744]: I0930 03:41:20.702653 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8482f"] Sep 30 03:41:20 crc kubenswrapper[4744]: I0930 03:41:20.761878 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8482f" event={"ID":"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9","Type":"ContainerStarted","Data":"4fba8dbae00630261fcff8e2adc4605bbeba10bec1b878c8d96a052d2703e977"} Sep 30 03:41:21 crc kubenswrapper[4744]: I0930 03:41:21.777771 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerID="0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8" exitCode=0 Sep 30 03:41:21 crc kubenswrapper[4744]: I0930 03:41:21.777863 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8482f" event={"ID":"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9","Type":"ContainerDied","Data":"0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8"} Sep 30 03:41:23 crc kubenswrapper[4744]: I0930 03:41:23.799156 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerID="89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b" exitCode=0 Sep 30 03:41:23 crc kubenswrapper[4744]: I0930 03:41:23.799297 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8482f" event={"ID":"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9","Type":"ContainerDied","Data":"89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b"} Sep 30 03:41:24 crc kubenswrapper[4744]: I0930 03:41:24.811774 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8482f" event={"ID":"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9","Type":"ContainerStarted","Data":"8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1"} Sep 30 03:41:24 crc kubenswrapper[4744]: I0930 03:41:24.844442 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8482f" podStartSLOduration=3.443652862 podStartE2EDuration="5.844418115s" podCreationTimestamp="2025-09-30 03:41:19 +0000 UTC" firstStartedPulling="2025-09-30 03:41:21.781639715 +0000 UTC m=+2808.954859689" lastFinishedPulling="2025-09-30 03:41:24.182404928 +0000 UTC m=+2811.355624942" observedRunningTime="2025-09-30 03:41:24.84007309 +0000 UTC m=+2812.013293104" watchObservedRunningTime="2025-09-30 03:41:24.844418115 +0000 UTC m=+2812.017638109" Sep 30 03:41:30 crc kubenswrapper[4744]: I0930 03:41:30.140291 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:30 crc kubenswrapper[4744]: I0930 03:41:30.141241 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:30 crc kubenswrapper[4744]: I0930 03:41:30.231054 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:30 crc kubenswrapper[4744]: I0930 03:41:30.958047 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:31 crc kubenswrapper[4744]: I0930 03:41:31.018837 4744 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8482f"] Sep 30 03:41:32 crc kubenswrapper[4744]: I0930 03:41:32.893987 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8482f" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="registry-server" containerID="cri-o://8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1" gracePeriod=2 Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.380919 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.550708 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-utilities\") pod \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.550801 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk965\" (UniqueName: \"kubernetes.io/projected/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-kube-api-access-qk965\") pod \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.550862 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-catalog-content\") pod \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\" (UID: \"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9\") " Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.552125 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-utilities" (OuterVolumeSpecName: "utilities") pod 
"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" (UID: "f4b01cb2-9b43-434e-a0c3-bb5f513f29e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.558744 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-kube-api-access-qk965" (OuterVolumeSpecName: "kube-api-access-qk965") pod "f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" (UID: "f4b01cb2-9b43-434e-a0c3-bb5f513f29e9"). InnerVolumeSpecName "kube-api-access-qk965". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.647122 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" (UID: "f4b01cb2-9b43-434e-a0c3-bb5f513f29e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.654583 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.654639 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk965\" (UniqueName: \"kubernetes.io/projected/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-kube-api-access-qk965\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.654681 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.911979 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerID="8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1" exitCode=0 Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.912019 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8482f" event={"ID":"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9","Type":"ContainerDied","Data":"8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1"} Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.912046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8482f" event={"ID":"f4b01cb2-9b43-434e-a0c3-bb5f513f29e9","Type":"ContainerDied","Data":"4fba8dbae00630261fcff8e2adc4605bbeba10bec1b878c8d96a052d2703e977"} Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.912065 4744 scope.go:117] "RemoveContainer" containerID="8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 
03:41:33.912070 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8482f" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.979044 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8482f"] Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.979103 4744 scope.go:117] "RemoveContainer" containerID="89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b" Sep 30 03:41:33 crc kubenswrapper[4744]: I0930 03:41:33.988004 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8482f"] Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.009670 4744 scope.go:117] "RemoveContainer" containerID="0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8" Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.065422 4744 scope.go:117] "RemoveContainer" containerID="8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1" Sep 30 03:41:34 crc kubenswrapper[4744]: E0930 03:41:34.066025 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1\": container with ID starting with 8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1 not found: ID does not exist" containerID="8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1" Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.066089 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1"} err="failed to get container status \"8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1\": rpc error: code = NotFound desc = could not find container \"8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1\": container with ID starting with 
8c6a921ab95b4d69d13dd647c14a83e9645266fc54f38f33c57aee90cc65dea1 not found: ID does not exist" Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.066139 4744 scope.go:117] "RemoveContainer" containerID="89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b" Sep 30 03:41:34 crc kubenswrapper[4744]: E0930 03:41:34.066622 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b\": container with ID starting with 89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b not found: ID does not exist" containerID="89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b" Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.066680 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b"} err="failed to get container status \"89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b\": rpc error: code = NotFound desc = could not find container \"89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b\": container with ID starting with 89f86644eff407a38e95fd4617ee2d79090c1f5d5f6f3286fff5541c4037973b not found: ID does not exist" Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.066757 4744 scope.go:117] "RemoveContainer" containerID="0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8" Sep 30 03:41:34 crc kubenswrapper[4744]: E0930 03:41:34.067142 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8\": container with ID starting with 0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8 not found: ID does not exist" containerID="0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8" Sep 30 03:41:34 crc 
kubenswrapper[4744]: I0930 03:41:34.067185 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8"} err="failed to get container status \"0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8\": rpc error: code = NotFound desc = could not find container \"0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8\": container with ID starting with 0c78105763f4b3620e6fc4383160023df6b375c5b21992e307d88b3bb1ed0cd8 not found: ID does not exist" Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.347657 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:41:34 crc kubenswrapper[4744]: I0930 03:41:34.348421 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:41:35 crc kubenswrapper[4744]: I0930 03:41:35.533646 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" path="/var/lib/kubelet/pods/f4b01cb2-9b43-434e-a0c3-bb5f513f29e9/volumes" Sep 30 03:42:04 crc kubenswrapper[4744]: I0930 03:42:04.347781 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:42:04 crc kubenswrapper[4744]: I0930 03:42:04.348459 4744 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.362295 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 03:42:09 crc kubenswrapper[4744]: E0930 03:42:09.366730 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="extract-content" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.366980 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="extract-content" Sep 30 03:42:09 crc kubenswrapper[4744]: E0930 03:42:09.367115 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="registry-server" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.367264 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="registry-server" Sep 30 03:42:09 crc kubenswrapper[4744]: E0930 03:42:09.367417 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="extract-utilities" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.367578 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="extract-utilities" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.368205 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b01cb2-9b43-434e-a0c3-bb5f513f29e9" containerName="registry-server" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.369856 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.375202 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.376047 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.376423 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.377977 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.531716 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.531924 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.532066 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 
crc kubenswrapper[4744]: I0930 03:42:09.532206 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.532340 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.532534 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-config-data\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.532614 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4j74\" (UniqueName: \"kubernetes.io/projected/f4a78f7a-b5bc-4636-81df-578f5105bce3-kube-api-access-q4j74\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.532750 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc 
kubenswrapper[4744]: I0930 03:42:09.532779 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635224 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-config-data\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635358 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4j74\" (UniqueName: \"kubernetes.io/projected/f4a78f7a-b5bc-4636-81df-578f5105bce3-kube-api-access-q4j74\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635456 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635477 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635844 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.635941 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.636227 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.636305 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.636344 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.636452 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.637154 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.637536 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-config-data\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.638453 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.644783 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ca-certs\") pod \"tempest-tests-tempest\" 
(UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.647550 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.663414 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.664622 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4j74\" (UniqueName: \"kubernetes.io/projected/f4a78f7a-b5bc-4636-81df-578f5105bce3-kube-api-access-q4j74\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.680893 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " pod="openstack/tempest-tests-tempest" Sep 30 03:42:09 crc kubenswrapper[4744]: I0930 03:42:09.712126 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 03:42:10 crc kubenswrapper[4744]: I0930 03:42:10.292120 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 03:42:10 crc kubenswrapper[4744]: I0930 03:42:10.324046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f4a78f7a-b5bc-4636-81df-578f5105bce3","Type":"ContainerStarted","Data":"c04e72a9d85adccd91c461b58cf2314b2ee6e7522adf0aa0f869e42eaa1d2f28"} Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.348324 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.348858 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.348927 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.349720 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.349774 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" gracePeriod=600 Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.552757 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" exitCode=0 Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.552840 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194"} Sep 30 03:42:34 crc kubenswrapper[4744]: I0930 03:42:34.553130 4744 scope.go:117] "RemoveContainer" containerID="dc88f8e7a76aaeb4ef100e7ac49085589ff7268236e04a20eee97f324ef466a8" Sep 30 03:42:38 crc kubenswrapper[4744]: E0930 03:42:38.619906 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:42:38 crc kubenswrapper[4744]: E0930 03:42:38.801267 4744 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Sep 30 03:42:38 crc kubenswrapper[4744]: E0930 03:42:38.801682 4744 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q4j74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(f4a78f7a-b5bc-4636-81df-578f5105bce3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 03:42:38 crc kubenswrapper[4744]: E0930 03:42:38.803066 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="f4a78f7a-b5bc-4636-81df-578f5105bce3" Sep 30 03:42:39 crc kubenswrapper[4744]: I0930 03:42:39.608103 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:42:39 crc kubenswrapper[4744]: E0930 03:42:39.608494 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:42:39 crc kubenswrapper[4744]: E0930 03:42:39.609342 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="f4a78f7a-b5bc-4636-81df-578f5105bce3" Sep 30 03:42:50 crc kubenswrapper[4744]: I0930 03:42:50.503525 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:42:50 crc kubenswrapper[4744]: E0930 03:42:50.504569 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:42:55 crc kubenswrapper[4744]: I0930 03:42:55.027248 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 03:42:57 crc kubenswrapper[4744]: I0930 03:42:57.801874 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f4a78f7a-b5bc-4636-81df-578f5105bce3","Type":"ContainerStarted","Data":"9d6e78a3adde425ff4cc4f71705b23a45577f722325d56f667bde6d003694c72"} Sep 30 03:42:57 crc kubenswrapper[4744]: I0930 03:42:57.833142 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" 
podStartSLOduration=5.119976819 podStartE2EDuration="49.833116264s" podCreationTimestamp="2025-09-30 03:42:08 +0000 UTC" firstStartedPulling="2025-09-30 03:42:10.307577192 +0000 UTC m=+2857.480797156" lastFinishedPulling="2025-09-30 03:42:55.020716617 +0000 UTC m=+2902.193936601" observedRunningTime="2025-09-30 03:42:57.821047189 +0000 UTC m=+2904.994267173" watchObservedRunningTime="2025-09-30 03:42:57.833116264 +0000 UTC m=+2905.006336238" Sep 30 03:43:04 crc kubenswrapper[4744]: I0930 03:43:04.503673 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:43:04 crc kubenswrapper[4744]: E0930 03:43:04.504454 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:43:15 crc kubenswrapper[4744]: I0930 03:43:15.503565 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:43:15 crc kubenswrapper[4744]: E0930 03:43:15.504520 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:43:26 crc kubenswrapper[4744]: I0930 03:43:26.503883 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:43:26 crc 
kubenswrapper[4744]: E0930 03:43:26.504749 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:43:37 crc kubenswrapper[4744]: I0930 03:43:37.503744 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:43:37 crc kubenswrapper[4744]: E0930 03:43:37.504681 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:43:51 crc kubenswrapper[4744]: I0930 03:43:51.503791 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:43:51 crc kubenswrapper[4744]: E0930 03:43:51.504412 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:44:02 crc kubenswrapper[4744]: I0930 03:44:02.504130 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 
30 03:44:02 crc kubenswrapper[4744]: E0930 03:44:02.506152 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:44:14 crc kubenswrapper[4744]: I0930 03:44:14.503708 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:44:14 crc kubenswrapper[4744]: E0930 03:44:14.504245 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:44:28 crc kubenswrapper[4744]: I0930 03:44:28.503935 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:44:28 crc kubenswrapper[4744]: E0930 03:44:28.505214 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:44:40 crc kubenswrapper[4744]: I0930 03:44:40.504154 4744 scope.go:117] "RemoveContainer" 
containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:44:40 crc kubenswrapper[4744]: E0930 03:44:40.504993 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:44:53 crc kubenswrapper[4744]: I0930 03:44:53.512328 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:44:53 crc kubenswrapper[4744]: E0930 03:44:53.513334 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.199320 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg"] Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.202967 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.206196 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.207003 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.212894 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg"] Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.340181 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f9e2978-a3a1-4545-9e15-47bb36d54b25-secret-volume\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.340282 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqdtg\" (UniqueName: \"kubernetes.io/projected/2f9e2978-a3a1-4545-9e15-47bb36d54b25-kube-api-access-cqdtg\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.340441 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f9e2978-a3a1-4545-9e15-47bb36d54b25-config-volume\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.442963 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f9e2978-a3a1-4545-9e15-47bb36d54b25-config-volume\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.443103 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f9e2978-a3a1-4545-9e15-47bb36d54b25-secret-volume\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.443255 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqdtg\" (UniqueName: \"kubernetes.io/projected/2f9e2978-a3a1-4545-9e15-47bb36d54b25-kube-api-access-cqdtg\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.444920 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f9e2978-a3a1-4545-9e15-47bb36d54b25-config-volume\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.452115 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2f9e2978-a3a1-4545-9e15-47bb36d54b25-secret-volume\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.466835 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqdtg\" (UniqueName: \"kubernetes.io/projected/2f9e2978-a3a1-4545-9e15-47bb36d54b25-kube-api-access-cqdtg\") pod \"collect-profiles-29320065-vm9mg\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:00 crc kubenswrapper[4744]: I0930 03:45:00.539932 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:01 crc kubenswrapper[4744]: I0930 03:45:01.019161 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg"] Sep 30 03:45:01 crc kubenswrapper[4744]: I0930 03:45:01.086338 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" event={"ID":"2f9e2978-a3a1-4545-9e15-47bb36d54b25","Type":"ContainerStarted","Data":"81fed6ab1e595ce4285e519a8fa078aae87e4036e05ae079b856b4e9a137cd0d"} Sep 30 03:45:02 crc kubenswrapper[4744]: I0930 03:45:02.096763 4744 generic.go:334] "Generic (PLEG): container finished" podID="2f9e2978-a3a1-4545-9e15-47bb36d54b25" containerID="2748e8b8124ff1700e64b2613f1d046e63ca719d72503d6581a124cba97faa6e" exitCode=0 Sep 30 03:45:02 crc kubenswrapper[4744]: I0930 03:45:02.097736 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" 
event={"ID":"2f9e2978-a3a1-4545-9e15-47bb36d54b25","Type":"ContainerDied","Data":"2748e8b8124ff1700e64b2613f1d046e63ca719d72503d6581a124cba97faa6e"} Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.751875 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.919937 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f9e2978-a3a1-4545-9e15-47bb36d54b25-secret-volume\") pod \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.920108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f9e2978-a3a1-4545-9e15-47bb36d54b25-config-volume\") pod \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.921427 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqdtg\" (UniqueName: \"kubernetes.io/projected/2f9e2978-a3a1-4545-9e15-47bb36d54b25-kube-api-access-cqdtg\") pod \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\" (UID: \"2f9e2978-a3a1-4545-9e15-47bb36d54b25\") " Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.921510 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f9e2978-a3a1-4545-9e15-47bb36d54b25-config-volume" (OuterVolumeSpecName: "config-volume") pod "2f9e2978-a3a1-4545-9e15-47bb36d54b25" (UID: "2f9e2978-a3a1-4545-9e15-47bb36d54b25"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.922255 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f9e2978-a3a1-4545-9e15-47bb36d54b25-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.929842 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f9e2978-a3a1-4545-9e15-47bb36d54b25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2f9e2978-a3a1-4545-9e15-47bb36d54b25" (UID: "2f9e2978-a3a1-4545-9e15-47bb36d54b25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 03:45:03 crc kubenswrapper[4744]: I0930 03:45:03.947060 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f9e2978-a3a1-4545-9e15-47bb36d54b25-kube-api-access-cqdtg" (OuterVolumeSpecName: "kube-api-access-cqdtg") pod "2f9e2978-a3a1-4545-9e15-47bb36d54b25" (UID: "2f9e2978-a3a1-4545-9e15-47bb36d54b25"). InnerVolumeSpecName "kube-api-access-cqdtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.024434 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f9e2978-a3a1-4545-9e15-47bb36d54b25-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.024473 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqdtg\" (UniqueName: \"kubernetes.io/projected/2f9e2978-a3a1-4545-9e15-47bb36d54b25-kube-api-access-cqdtg\") on node \"crc\" DevicePath \"\"" Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.114231 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" event={"ID":"2f9e2978-a3a1-4545-9e15-47bb36d54b25","Type":"ContainerDied","Data":"81fed6ab1e595ce4285e519a8fa078aae87e4036e05ae079b856b4e9a137cd0d"} Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.114267 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81fed6ab1e595ce4285e519a8fa078aae87e4036e05ae079b856b4e9a137cd0d" Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.114280 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg" Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.829208 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj"] Sep 30 03:45:04 crc kubenswrapper[4744]: I0930 03:45:04.835027 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320020-6zrcj"] Sep 30 03:45:05 crc kubenswrapper[4744]: I0930 03:45:05.517328 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6745fc8a-f3db-42e5-b034-4b20a40fe2bf" path="/var/lib/kubelet/pods/6745fc8a-f3db-42e5-b034-4b20a40fe2bf/volumes" Sep 30 03:45:07 crc kubenswrapper[4744]: I0930 03:45:07.504841 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:45:07 crc kubenswrapper[4744]: E0930 03:45:07.505781 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:45:18 crc kubenswrapper[4744]: I0930 03:45:18.504134 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:45:18 crc kubenswrapper[4744]: E0930 03:45:18.505289 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:45:32 crc kubenswrapper[4744]: I0930 03:45:32.504023 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:45:32 crc kubenswrapper[4744]: E0930 03:45:32.505079 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:45:41 crc kubenswrapper[4744]: I0930 03:45:41.845677 4744 scope.go:117] "RemoveContainer" containerID="f6880137063365cd30d482582bc5023817815a8d74510201f7907af9635624c4" Sep 30 03:45:44 crc kubenswrapper[4744]: I0930 03:45:44.505244 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:45:44 crc kubenswrapper[4744]: E0930 03:45:44.506214 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:45:59 crc kubenswrapper[4744]: I0930 03:45:59.504042 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:45:59 crc kubenswrapper[4744]: E0930 03:45:59.504989 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:46:12 crc kubenswrapper[4744]: I0930 03:46:12.504540 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:46:12 crc kubenswrapper[4744]: E0930 03:46:12.505198 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:46:24 crc kubenswrapper[4744]: I0930 03:46:24.503555 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:46:24 crc kubenswrapper[4744]: E0930 03:46:24.504379 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.782591 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pmkrp"] Sep 30 03:46:29 crc kubenswrapper[4744]: E0930 03:46:29.783314 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9e2978-a3a1-4545-9e15-47bb36d54b25" 
containerName="collect-profiles" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.783326 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9e2978-a3a1-4545-9e15-47bb36d54b25" containerName="collect-profiles" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.783558 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f9e2978-a3a1-4545-9e15-47bb36d54b25" containerName="collect-profiles" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.785036 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.805001 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmkrp"] Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.912588 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-utilities\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.912779 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-catalog-content\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:29 crc kubenswrapper[4744]: I0930 03:46:29.912852 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p72p9\" (UniqueName: \"kubernetes.io/projected/24d6a156-e519-4b2c-9007-231a9809cea6-kube-api-access-p72p9\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " 
pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.015053 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-utilities\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.015191 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-catalog-content\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.015259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p72p9\" (UniqueName: \"kubernetes.io/projected/24d6a156-e519-4b2c-9007-231a9809cea6-kube-api-access-p72p9\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.015685 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-utilities\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.015992 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-catalog-content\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" 
Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.046343 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p72p9\" (UniqueName: \"kubernetes.io/projected/24d6a156-e519-4b2c-9007-231a9809cea6-kube-api-access-p72p9\") pod \"redhat-marketplace-pmkrp\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.101992 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:30 crc kubenswrapper[4744]: I0930 03:46:30.666803 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmkrp"] Sep 30 03:46:31 crc kubenswrapper[4744]: I0930 03:46:31.014489 4744 generic.go:334] "Generic (PLEG): container finished" podID="24d6a156-e519-4b2c-9007-231a9809cea6" containerID="7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b" exitCode=0 Sep 30 03:46:31 crc kubenswrapper[4744]: I0930 03:46:31.014671 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerDied","Data":"7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b"} Sep 30 03:46:31 crc kubenswrapper[4744]: I0930 03:46:31.014728 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerStarted","Data":"5539d6708374adee74f6b7ba8283c662a19b5accc40e18a063f44918128b507c"} Sep 30 03:46:31 crc kubenswrapper[4744]: I0930 03:46:31.016458 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:46:32 crc kubenswrapper[4744]: I0930 03:46:32.025046 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" 
event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerStarted","Data":"f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15"} Sep 30 03:46:33 crc kubenswrapper[4744]: I0930 03:46:33.044102 4744 generic.go:334] "Generic (PLEG): container finished" podID="24d6a156-e519-4b2c-9007-231a9809cea6" containerID="f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15" exitCode=0 Sep 30 03:46:33 crc kubenswrapper[4744]: I0930 03:46:33.044140 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerDied","Data":"f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15"} Sep 30 03:46:34 crc kubenswrapper[4744]: I0930 03:46:34.059985 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerStarted","Data":"573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7"} Sep 30 03:46:34 crc kubenswrapper[4744]: I0930 03:46:34.083462 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pmkrp" podStartSLOduration=2.552699865 podStartE2EDuration="5.083445549s" podCreationTimestamp="2025-09-30 03:46:29 +0000 UTC" firstStartedPulling="2025-09-30 03:46:31.016230739 +0000 UTC m=+3118.189450713" lastFinishedPulling="2025-09-30 03:46:33.546976413 +0000 UTC m=+3120.720196397" observedRunningTime="2025-09-30 03:46:34.082197561 +0000 UTC m=+3121.255417555" watchObservedRunningTime="2025-09-30 03:46:34.083445549 +0000 UTC m=+3121.256665523" Sep 30 03:46:35 crc kubenswrapper[4744]: I0930 03:46:35.504361 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:46:35 crc kubenswrapper[4744]: E0930 03:46:35.505316 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:46:40 crc kubenswrapper[4744]: I0930 03:46:40.102998 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:40 crc kubenswrapper[4744]: I0930 03:46:40.103687 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:40 crc kubenswrapper[4744]: I0930 03:46:40.183716 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:40 crc kubenswrapper[4744]: I0930 03:46:40.264166 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:40 crc kubenswrapper[4744]: I0930 03:46:40.425535 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmkrp"] Sep 30 03:46:41 crc kubenswrapper[4744]: I0930 03:46:41.942180 4744 scope.go:117] "RemoveContainer" containerID="c7f0ab35cac7965f7ff0babaea506860f52772b5124819e4ab1690c4706bd7cb" Sep 30 03:46:41 crc kubenswrapper[4744]: I0930 03:46:41.976438 4744 scope.go:117] "RemoveContainer" containerID="abf22e550945cc7080a593be9cbe2912d19724837328181606364f7e02dd344f" Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.010171 4744 scope.go:117] "RemoveContainer" containerID="add2915072424feddb6e1f9fef5dfef186d1c20c5b7fdf48187076a2daeb803d" Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.133317 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-pmkrp" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="registry-server" containerID="cri-o://573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7" gracePeriod=2 Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.894205 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.978574 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p72p9\" (UniqueName: \"kubernetes.io/projected/24d6a156-e519-4b2c-9007-231a9809cea6-kube-api-access-p72p9\") pod \"24d6a156-e519-4b2c-9007-231a9809cea6\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.978829 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-utilities\") pod \"24d6a156-e519-4b2c-9007-231a9809cea6\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.978886 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-catalog-content\") pod \"24d6a156-e519-4b2c-9007-231a9809cea6\" (UID: \"24d6a156-e519-4b2c-9007-231a9809cea6\") " Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.979806 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-utilities" (OuterVolumeSpecName: "utilities") pod "24d6a156-e519-4b2c-9007-231a9809cea6" (UID: "24d6a156-e519-4b2c-9007-231a9809cea6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.981773 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.985484 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d6a156-e519-4b2c-9007-231a9809cea6-kube-api-access-p72p9" (OuterVolumeSpecName: "kube-api-access-p72p9") pod "24d6a156-e519-4b2c-9007-231a9809cea6" (UID: "24d6a156-e519-4b2c-9007-231a9809cea6"). InnerVolumeSpecName "kube-api-access-p72p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:46:42 crc kubenswrapper[4744]: I0930 03:46:42.997949 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24d6a156-e519-4b2c-9007-231a9809cea6" (UID: "24d6a156-e519-4b2c-9007-231a9809cea6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.083533 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p72p9\" (UniqueName: \"kubernetes.io/projected/24d6a156-e519-4b2c-9007-231a9809cea6-kube-api-access-p72p9\") on node \"crc\" DevicePath \"\"" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.083572 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24d6a156-e519-4b2c-9007-231a9809cea6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.150287 4744 generic.go:334] "Generic (PLEG): container finished" podID="24d6a156-e519-4b2c-9007-231a9809cea6" containerID="573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7" exitCode=0 Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.150350 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerDied","Data":"573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7"} Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.150346 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pmkrp" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.150422 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pmkrp" event={"ID":"24d6a156-e519-4b2c-9007-231a9809cea6","Type":"ContainerDied","Data":"5539d6708374adee74f6b7ba8283c662a19b5accc40e18a063f44918128b507c"} Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.150453 4744 scope.go:117] "RemoveContainer" containerID="573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.217485 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmkrp"] Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.227857 4744 scope.go:117] "RemoveContainer" containerID="f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.239543 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pmkrp"] Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.269906 4744 scope.go:117] "RemoveContainer" containerID="7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.298014 4744 scope.go:117] "RemoveContainer" containerID="573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7" Sep 30 03:46:43 crc kubenswrapper[4744]: E0930 03:46:43.298655 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7\": container with ID starting with 573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7 not found: ID does not exist" containerID="573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.298742 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7"} err="failed to get container status \"573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7\": rpc error: code = NotFound desc = could not find container \"573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7\": container with ID starting with 573c95a9ca33cd9b708c89246aaf1a9b9e55092b02b83766c6e9506d29c79eb7 not found: ID does not exist" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.298800 4744 scope.go:117] "RemoveContainer" containerID="f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15" Sep 30 03:46:43 crc kubenswrapper[4744]: E0930 03:46:43.299210 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15\": container with ID starting with f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15 not found: ID does not exist" containerID="f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.299280 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15"} err="failed to get container status \"f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15\": rpc error: code = NotFound desc = could not find container \"f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15\": container with ID starting with f5a3f1e5a0ba36f8b50d74d9f68317a80d83c53b83bad7cf1e9d43cda6e15f15 not found: ID does not exist" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.299321 4744 scope.go:117] "RemoveContainer" containerID="7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b" Sep 30 03:46:43 crc kubenswrapper[4744]: E0930 
03:46:43.299870 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b\": container with ID starting with 7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b not found: ID does not exist" containerID="7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.299972 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b"} err="failed to get container status \"7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b\": rpc error: code = NotFound desc = could not find container \"7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b\": container with ID starting with 7098afae10da410117baeee303bd58e32feca86988e4bc90b3dd219802b8bb6b not found: ID does not exist" Sep 30 03:46:43 crc kubenswrapper[4744]: I0930 03:46:43.523181 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" path="/var/lib/kubelet/pods/24d6a156-e519-4b2c-9007-231a9809cea6/volumes" Sep 30 03:46:47 crc kubenswrapper[4744]: I0930 03:46:47.503863 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:46:47 crc kubenswrapper[4744]: E0930 03:46:47.504814 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:47:00 crc kubenswrapper[4744]: I0930 03:47:00.503800 
4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:47:00 crc kubenswrapper[4744]: E0930 03:47:00.504838 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:47:14 crc kubenswrapper[4744]: I0930 03:47:14.503584 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:47:14 crc kubenswrapper[4744]: E0930 03:47:14.504388 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:47:26 crc kubenswrapper[4744]: I0930 03:47:26.504250 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:47:26 crc kubenswrapper[4744]: E0930 03:47:26.505118 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:47:41 crc kubenswrapper[4744]: I0930 
03:47:41.504063 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:47:42 crc kubenswrapper[4744]: I0930 03:47:42.677389 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"99fe49680c1b6aa39bba2bdcee84bc111f502657f8907c2575eed1d9238b45f4"} Sep 30 03:50:04 crc kubenswrapper[4744]: I0930 03:50:04.348241 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:50:04 crc kubenswrapper[4744]: I0930 03:50:04.349083 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.841031 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7w8lm"] Sep 30 03:50:22 crc kubenswrapper[4744]: E0930 03:50:22.842171 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="extract-utilities" Sep 30 03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.842192 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="extract-utilities" Sep 30 03:50:22 crc kubenswrapper[4744]: E0930 03:50:22.842222 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="extract-content" Sep 30 
03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.842232 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="extract-content" Sep 30 03:50:22 crc kubenswrapper[4744]: E0930 03:50:22.842249 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="registry-server" Sep 30 03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.842258 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="registry-server" Sep 30 03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.842548 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d6a156-e519-4b2c-9007-231a9809cea6" containerName="registry-server" Sep 30 03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.845649 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:22 crc kubenswrapper[4744]: I0930 03:50:22.856060 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7w8lm"] Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.001923 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxbsv\" (UniqueName: \"kubernetes.io/projected/c33c51d7-f47e-41ca-9a3e-7093542f1232-kube-api-access-vxbsv\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.002010 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-utilities\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 
crc kubenswrapper[4744]: I0930 03:50:23.002049 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-catalog-content\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.104485 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxbsv\" (UniqueName: \"kubernetes.io/projected/c33c51d7-f47e-41ca-9a3e-7093542f1232-kube-api-access-vxbsv\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.104571 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-utilities\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.104623 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-catalog-content\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.105167 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-catalog-content\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc 
kubenswrapper[4744]: I0930 03:50:23.106000 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-utilities\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.125134 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxbsv\" (UniqueName: \"kubernetes.io/projected/c33c51d7-f47e-41ca-9a3e-7093542f1232-kube-api-access-vxbsv\") pod \"certified-operators-7w8lm\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.164328 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:23 crc kubenswrapper[4744]: I0930 03:50:23.651634 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7w8lm"] Sep 30 03:50:24 crc kubenswrapper[4744]: I0930 03:50:24.269041 4744 generic.go:334] "Generic (PLEG): container finished" podID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerID="6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f" exitCode=0 Sep 30 03:50:24 crc kubenswrapper[4744]: I0930 03:50:24.269143 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerDied","Data":"6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f"} Sep 30 03:50:24 crc kubenswrapper[4744]: I0930 03:50:24.269447 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" 
event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerStarted","Data":"6c1fdb3b71186a1ccc934fd19a12363d572d8f15c58b5ff763f29b628a9b0ec2"} Sep 30 03:50:25 crc kubenswrapper[4744]: I0930 03:50:25.283268 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerStarted","Data":"d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f"} Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.303338 4744 generic.go:334] "Generic (PLEG): container finished" podID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerID="d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f" exitCode=0 Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.303407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerDied","Data":"d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f"} Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.639190 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbhx6"] Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.641208 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.649047 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbhx6"] Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.803961 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-catalog-content\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.804154 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzjb2\" (UniqueName: \"kubernetes.io/projected/c82b81d6-f114-448b-b0c0-06ab7294d38e-kube-api-access-jzjb2\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.804176 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-utilities\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.905672 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzjb2\" (UniqueName: \"kubernetes.io/projected/c82b81d6-f114-448b-b0c0-06ab7294d38e-kube-api-access-jzjb2\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.905708 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-utilities\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.905744 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-catalog-content\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.906153 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-catalog-content\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.906352 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-utilities\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.939571 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzjb2\" (UniqueName: \"kubernetes.io/projected/c82b81d6-f114-448b-b0c0-06ab7294d38e-kube-api-access-jzjb2\") pod \"redhat-operators-pbhx6\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:27 crc kubenswrapper[4744]: I0930 03:50:27.964606 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:28 crc kubenswrapper[4744]: I0930 03:50:28.326870 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerStarted","Data":"9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8"} Sep 30 03:50:28 crc kubenswrapper[4744]: I0930 03:50:28.367046 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7w8lm" podStartSLOduration=2.8976106489999998 podStartE2EDuration="6.367029735s" podCreationTimestamp="2025-09-30 03:50:22 +0000 UTC" firstStartedPulling="2025-09-30 03:50:24.272326066 +0000 UTC m=+3351.445546040" lastFinishedPulling="2025-09-30 03:50:27.741745142 +0000 UTC m=+3354.914965126" observedRunningTime="2025-09-30 03:50:28.361480873 +0000 UTC m=+3355.534700847" watchObservedRunningTime="2025-09-30 03:50:28.367029735 +0000 UTC m=+3355.540249709" Sep 30 03:50:28 crc kubenswrapper[4744]: I0930 03:50:28.527520 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbhx6"] Sep 30 03:50:28 crc kubenswrapper[4744]: W0930 03:50:28.533339 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc82b81d6_f114_448b_b0c0_06ab7294d38e.slice/crio-a7bb5bf932f0d258e05c793c3f0b2f455875e31fe7e2d2783c7a9def9ba570d1 WatchSource:0}: Error finding container a7bb5bf932f0d258e05c793c3f0b2f455875e31fe7e2d2783c7a9def9ba570d1: Status 404 returned error can't find the container with id a7bb5bf932f0d258e05c793c3f0b2f455875e31fe7e2d2783c7a9def9ba570d1 Sep 30 03:50:29 crc kubenswrapper[4744]: I0930 03:50:29.336110 4744 generic.go:334] "Generic (PLEG): container finished" podID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerID="03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7" 
exitCode=0 Sep 30 03:50:29 crc kubenswrapper[4744]: I0930 03:50:29.336216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerDied","Data":"03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7"} Sep 30 03:50:29 crc kubenswrapper[4744]: I0930 03:50:29.336579 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerStarted","Data":"a7bb5bf932f0d258e05c793c3f0b2f455875e31fe7e2d2783c7a9def9ba570d1"} Sep 30 03:50:30 crc kubenswrapper[4744]: I0930 03:50:30.348622 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerStarted","Data":"21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6"} Sep 30 03:50:33 crc kubenswrapper[4744]: I0930 03:50:33.165053 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:33 crc kubenswrapper[4744]: I0930 03:50:33.166408 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:34 crc kubenswrapper[4744]: I0930 03:50:34.242904 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7w8lm" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="registry-server" probeResult="failure" output=< Sep 30 03:50:34 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 03:50:34 crc kubenswrapper[4744]: > Sep 30 03:50:34 crc kubenswrapper[4744]: I0930 03:50:34.348003 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:50:34 crc kubenswrapper[4744]: I0930 03:50:34.348268 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:50:36 crc kubenswrapper[4744]: I0930 03:50:36.416262 4744 generic.go:334] "Generic (PLEG): container finished" podID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerID="21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6" exitCode=0 Sep 30 03:50:36 crc kubenswrapper[4744]: I0930 03:50:36.416448 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerDied","Data":"21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6"} Sep 30 03:50:37 crc kubenswrapper[4744]: I0930 03:50:37.430103 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerStarted","Data":"7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762"} Sep 30 03:50:37 crc kubenswrapper[4744]: I0930 03:50:37.453049 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbhx6" podStartSLOduration=2.937347109 podStartE2EDuration="10.453028395s" podCreationTimestamp="2025-09-30 03:50:27 +0000 UTC" firstStartedPulling="2025-09-30 03:50:29.339739578 +0000 UTC m=+3356.512959562" lastFinishedPulling="2025-09-30 03:50:36.855420874 +0000 UTC m=+3364.028640848" observedRunningTime="2025-09-30 03:50:37.447776301 +0000 UTC m=+3364.620996295" 
watchObservedRunningTime="2025-09-30 03:50:37.453028395 +0000 UTC m=+3364.626248389" Sep 30 03:50:37 crc kubenswrapper[4744]: I0930 03:50:37.965478 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:37 crc kubenswrapper[4744]: I0930 03:50:37.965542 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:39 crc kubenswrapper[4744]: I0930 03:50:39.055694 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pbhx6" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="registry-server" probeResult="failure" output=< Sep 30 03:50:39 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 03:50:39 crc kubenswrapper[4744]: > Sep 30 03:50:43 crc kubenswrapper[4744]: I0930 03:50:43.216085 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:43 crc kubenswrapper[4744]: I0930 03:50:43.265114 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:43 crc kubenswrapper[4744]: I0930 03:50:43.454941 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7w8lm"] Sep 30 03:50:44 crc kubenswrapper[4744]: I0930 03:50:44.492283 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7w8lm" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="registry-server" containerID="cri-o://9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8" gracePeriod=2 Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.329687 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.453825 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-utilities\") pod \"c33c51d7-f47e-41ca-9a3e-7093542f1232\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.453858 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-catalog-content\") pod \"c33c51d7-f47e-41ca-9a3e-7093542f1232\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.453941 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxbsv\" (UniqueName: \"kubernetes.io/projected/c33c51d7-f47e-41ca-9a3e-7093542f1232-kube-api-access-vxbsv\") pod \"c33c51d7-f47e-41ca-9a3e-7093542f1232\" (UID: \"c33c51d7-f47e-41ca-9a3e-7093542f1232\") " Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.458108 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-utilities" (OuterVolumeSpecName: "utilities") pod "c33c51d7-f47e-41ca-9a3e-7093542f1232" (UID: "c33c51d7-f47e-41ca-9a3e-7093542f1232"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.470731 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33c51d7-f47e-41ca-9a3e-7093542f1232-kube-api-access-vxbsv" (OuterVolumeSpecName: "kube-api-access-vxbsv") pod "c33c51d7-f47e-41ca-9a3e-7093542f1232" (UID: "c33c51d7-f47e-41ca-9a3e-7093542f1232"). InnerVolumeSpecName "kube-api-access-vxbsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.508896 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c33c51d7-f47e-41ca-9a3e-7093542f1232" (UID: "c33c51d7-f47e-41ca-9a3e-7093542f1232"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.526282 4744 generic.go:334] "Generic (PLEG): container finished" podID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerID="9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8" exitCode=0 Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.526423 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w8lm" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.533755 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerDied","Data":"9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8"} Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.533794 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w8lm" event={"ID":"c33c51d7-f47e-41ca-9a3e-7093542f1232","Type":"ContainerDied","Data":"6c1fdb3b71186a1ccc934fd19a12363d572d8f15c58b5ff763f29b628a9b0ec2"} Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.533813 4744 scope.go:117] "RemoveContainer" containerID="9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.556070 4744 scope.go:117] "RemoveContainer" containerID="d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f" Sep 30 03:50:45 crc kubenswrapper[4744]: 
I0930 03:50:45.563453 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.563495 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c33c51d7-f47e-41ca-9a3e-7093542f1232-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.563514 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxbsv\" (UniqueName: \"kubernetes.io/projected/c33c51d7-f47e-41ca-9a3e-7093542f1232-kube-api-access-vxbsv\") on node \"crc\" DevicePath \"\"" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.575995 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7w8lm"] Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.598145 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7w8lm"] Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.598155 4744 scope.go:117] "RemoveContainer" containerID="6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.625586 4744 scope.go:117] "RemoveContainer" containerID="9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8" Sep 30 03:50:45 crc kubenswrapper[4744]: E0930 03:50:45.626166 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8\": container with ID starting with 9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8 not found: ID does not exist" containerID="9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 
03:50:45.626212 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8"} err="failed to get container status \"9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8\": rpc error: code = NotFound desc = could not find container \"9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8\": container with ID starting with 9d05c90c230a8f8fcd7af3b2555ecf679b91f6b556053af24d6a7601c2808cf8 not found: ID does not exist" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.626259 4744 scope.go:117] "RemoveContainer" containerID="d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f" Sep 30 03:50:45 crc kubenswrapper[4744]: E0930 03:50:45.627490 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f\": container with ID starting with d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f not found: ID does not exist" containerID="d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.627521 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f"} err="failed to get container status \"d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f\": rpc error: code = NotFound desc = could not find container \"d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f\": container with ID starting with d32387012c0739f806f0cc9c8f88aaeb90b16f79cfd27299db7b88722716b33f not found: ID does not exist" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.627537 4744 scope.go:117] "RemoveContainer" containerID="6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f" Sep 30 03:50:45 crc 
kubenswrapper[4744]: E0930 03:50:45.627838 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f\": container with ID starting with 6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f not found: ID does not exist" containerID="6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f" Sep 30 03:50:45 crc kubenswrapper[4744]: I0930 03:50:45.627864 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f"} err="failed to get container status \"6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f\": rpc error: code = NotFound desc = could not find container \"6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f\": container with ID starting with 6e0c84a29aaba09baed3564edf309216a66a7fef3866dad3fff1b79db32def1f not found: ID does not exist" Sep 30 03:50:47 crc kubenswrapper[4744]: I0930 03:50:47.513837 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" path="/var/lib/kubelet/pods/c33c51d7-f47e-41ca-9a3e-7093542f1232/volumes" Sep 30 03:50:48 crc kubenswrapper[4744]: I0930 03:50:48.050591 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:48 crc kubenswrapper[4744]: I0930 03:50:48.115425 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:48 crc kubenswrapper[4744]: I0930 03:50:48.853766 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbhx6"] Sep 30 03:50:49 crc kubenswrapper[4744]: I0930 03:50:49.566545 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-pbhx6" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="registry-server" containerID="cri-o://7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762" gracePeriod=2 Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.163446 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.270108 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-utilities\") pod \"c82b81d6-f114-448b-b0c0-06ab7294d38e\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.270580 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzjb2\" (UniqueName: \"kubernetes.io/projected/c82b81d6-f114-448b-b0c0-06ab7294d38e-kube-api-access-jzjb2\") pod \"c82b81d6-f114-448b-b0c0-06ab7294d38e\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.270643 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-catalog-content\") pod \"c82b81d6-f114-448b-b0c0-06ab7294d38e\" (UID: \"c82b81d6-f114-448b-b0c0-06ab7294d38e\") " Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.271196 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-utilities" (OuterVolumeSpecName: "utilities") pod "c82b81d6-f114-448b-b0c0-06ab7294d38e" (UID: "c82b81d6-f114-448b-b0c0-06ab7294d38e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.271504 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.285884 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82b81d6-f114-448b-b0c0-06ab7294d38e-kube-api-access-jzjb2" (OuterVolumeSpecName: "kube-api-access-jzjb2") pod "c82b81d6-f114-448b-b0c0-06ab7294d38e" (UID: "c82b81d6-f114-448b-b0c0-06ab7294d38e"). InnerVolumeSpecName "kube-api-access-jzjb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.369436 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c82b81d6-f114-448b-b0c0-06ab7294d38e" (UID: "c82b81d6-f114-448b-b0c0-06ab7294d38e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.373031 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzjb2\" (UniqueName: \"kubernetes.io/projected/c82b81d6-f114-448b-b0c0-06ab7294d38e-kube-api-access-jzjb2\") on node \"crc\" DevicePath \"\"" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.373067 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c82b81d6-f114-448b-b0c0-06ab7294d38e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.580530 4744 generic.go:334] "Generic (PLEG): container finished" podID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerID="7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762" exitCode=0 Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.580573 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerDied","Data":"7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762"} Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.580609 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbhx6" event={"ID":"c82b81d6-f114-448b-b0c0-06ab7294d38e","Type":"ContainerDied","Data":"a7bb5bf932f0d258e05c793c3f0b2f455875e31fe7e2d2783c7a9def9ba570d1"} Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.580666 4744 scope.go:117] "RemoveContainer" containerID="7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.580686 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbhx6" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.635728 4744 scope.go:117] "RemoveContainer" containerID="21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.640410 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbhx6"] Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.656517 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbhx6"] Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.683469 4744 scope.go:117] "RemoveContainer" containerID="03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.730913 4744 scope.go:117] "RemoveContainer" containerID="7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762" Sep 30 03:50:50 crc kubenswrapper[4744]: E0930 03:50:50.731429 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762\": container with ID starting with 7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762 not found: ID does not exist" containerID="7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.731469 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762"} err="failed to get container status \"7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762\": rpc error: code = NotFound desc = could not find container \"7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762\": container with ID starting with 7f5f0f0047ca7c0399e9ee07c7d93138a545b303e59f97b45d922b1816d00762 not found: ID does 
not exist" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.731496 4744 scope.go:117] "RemoveContainer" containerID="21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6" Sep 30 03:50:50 crc kubenswrapper[4744]: E0930 03:50:50.731900 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6\": container with ID starting with 21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6 not found: ID does not exist" containerID="21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.731949 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6"} err="failed to get container status \"21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6\": rpc error: code = NotFound desc = could not find container \"21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6\": container with ID starting with 21cb42a6f83576bc1b05d636e9c6f14c4c1a5e272a6cf558e97e8243511d28e6 not found: ID does not exist" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.731983 4744 scope.go:117] "RemoveContainer" containerID="03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7" Sep 30 03:50:50 crc kubenswrapper[4744]: E0930 03:50:50.732335 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7\": container with ID starting with 03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7 not found: ID does not exist" containerID="03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7" Sep 30 03:50:50 crc kubenswrapper[4744]: I0930 03:50:50.732460 4744 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7"} err="failed to get container status \"03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7\": rpc error: code = NotFound desc = could not find container \"03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7\": container with ID starting with 03d21d71f8795f8023b70356c8fbcfbc8439b9b4308f9c42eb02c5e5900010e7 not found: ID does not exist" Sep 30 03:50:51 crc kubenswrapper[4744]: I0930 03:50:51.522924 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" path="/var/lib/kubelet/pods/c82b81d6-f114-448b-b0c0-06ab7294d38e/volumes" Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.348271 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.349014 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.349087 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.350272 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99fe49680c1b6aa39bba2bdcee84bc111f502657f8907c2575eed1d9238b45f4"} 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.350430 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://99fe49680c1b6aa39bba2bdcee84bc111f502657f8907c2575eed1d9238b45f4" gracePeriod=600 Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.737473 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="99fe49680c1b6aa39bba2bdcee84bc111f502657f8907c2575eed1d9238b45f4" exitCode=0 Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.737550 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"99fe49680c1b6aa39bba2bdcee84bc111f502657f8907c2575eed1d9238b45f4"} Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.737857 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e"} Sep 30 03:51:04 crc kubenswrapper[4744]: I0930 03:51:04.737882 4744 scope.go:117] "RemoveContainer" containerID="54133c2c1d167ec7e57ce864adbcfd60eeef56d08fea187477b1368b0850c194" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.686526 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qlpz2"] Sep 30 03:52:24 crc kubenswrapper[4744]: E0930 03:52:24.687395 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="extract-content" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687407 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="extract-content" Sep 30 03:52:24 crc kubenswrapper[4744]: E0930 03:52:24.687423 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="extract-content" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687429 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="extract-content" Sep 30 03:52:24 crc kubenswrapper[4744]: E0930 03:52:24.687442 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="extract-utilities" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687448 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="extract-utilities" Sep 30 03:52:24 crc kubenswrapper[4744]: E0930 03:52:24.687461 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="registry-server" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687467 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="registry-server" Sep 30 03:52:24 crc kubenswrapper[4744]: E0930 03:52:24.687484 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="registry-server" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687490 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="registry-server" Sep 30 03:52:24 crc kubenswrapper[4744]: E0930 03:52:24.687502 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="extract-utilities" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687508 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="extract-utilities" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687678 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33c51d7-f47e-41ca-9a3e-7093542f1232" containerName="registry-server" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.687691 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82b81d6-f114-448b-b0c0-06ab7294d38e" containerName="registry-server" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.689981 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.713128 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qlpz2"] Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.780895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4j94\" (UniqueName: \"kubernetes.io/projected/7369b47f-c90e-42c9-859c-ef693d18e0e2-kube-api-access-g4j94\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.780984 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-catalog-content\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.781111 4744 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-utilities\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.883329 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-utilities\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.883465 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4j94\" (UniqueName: \"kubernetes.io/projected/7369b47f-c90e-42c9-859c-ef693d18e0e2-kube-api-access-g4j94\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.883505 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-catalog-content\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.883954 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-utilities\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.883961 4744 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-catalog-content\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:24 crc kubenswrapper[4744]: I0930 03:52:24.903617 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4j94\" (UniqueName: \"kubernetes.io/projected/7369b47f-c90e-42c9-859c-ef693d18e0e2-kube-api-access-g4j94\") pod \"community-operators-qlpz2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:25 crc kubenswrapper[4744]: I0930 03:52:25.035928 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:25 crc kubenswrapper[4744]: I0930 03:52:25.546625 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qlpz2"] Sep 30 03:52:26 crc kubenswrapper[4744]: I0930 03:52:26.531800 4744 generic.go:334] "Generic (PLEG): container finished" podID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerID="6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5" exitCode=0 Sep 30 03:52:26 crc kubenswrapper[4744]: I0930 03:52:26.531853 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerDied","Data":"6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5"} Sep 30 03:52:26 crc kubenswrapper[4744]: I0930 03:52:26.533247 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerStarted","Data":"deeae08477e2aca2c17649575e5cd168abc24ded03a2f451dee7b67698f83642"} Sep 30 03:52:26 crc kubenswrapper[4744]: I0930 
03:52:26.534518 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 03:52:27 crc kubenswrapper[4744]: I0930 03:52:27.544483 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerStarted","Data":"64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633"} Sep 30 03:52:29 crc kubenswrapper[4744]: I0930 03:52:29.565613 4744 generic.go:334] "Generic (PLEG): container finished" podID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerID="64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633" exitCode=0 Sep 30 03:52:29 crc kubenswrapper[4744]: I0930 03:52:29.566163 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerDied","Data":"64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633"} Sep 30 03:52:30 crc kubenswrapper[4744]: I0930 03:52:30.579512 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerStarted","Data":"9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2"} Sep 30 03:52:30 crc kubenswrapper[4744]: I0930 03:52:30.607146 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qlpz2" podStartSLOduration=3.19263679 podStartE2EDuration="6.607125764s" podCreationTimestamp="2025-09-30 03:52:24 +0000 UTC" firstStartedPulling="2025-09-30 03:52:26.53411508 +0000 UTC m=+3473.707335084" lastFinishedPulling="2025-09-30 03:52:29.948604074 +0000 UTC m=+3477.121824058" observedRunningTime="2025-09-30 03:52:30.59900357 +0000 UTC m=+3477.772223554" watchObservedRunningTime="2025-09-30 03:52:30.607125764 +0000 UTC m=+3477.780345748" Sep 30 03:52:35 crc 
kubenswrapper[4744]: I0930 03:52:35.039007 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:35 crc kubenswrapper[4744]: I0930 03:52:35.042253 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:35 crc kubenswrapper[4744]: I0930 03:52:35.095252 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:35 crc kubenswrapper[4744]: I0930 03:52:35.715306 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:35 crc kubenswrapper[4744]: I0930 03:52:35.764204 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qlpz2"] Sep 30 03:52:37 crc kubenswrapper[4744]: I0930 03:52:37.657813 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qlpz2" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="registry-server" containerID="cri-o://9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2" gracePeriod=2 Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.441748 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.550234 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4j94\" (UniqueName: \"kubernetes.io/projected/7369b47f-c90e-42c9-859c-ef693d18e0e2-kube-api-access-g4j94\") pod \"7369b47f-c90e-42c9-859c-ef693d18e0e2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.550345 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-utilities\") pod \"7369b47f-c90e-42c9-859c-ef693d18e0e2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.550402 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-catalog-content\") pod \"7369b47f-c90e-42c9-859c-ef693d18e0e2\" (UID: \"7369b47f-c90e-42c9-859c-ef693d18e0e2\") " Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.551724 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-utilities" (OuterVolumeSpecName: "utilities") pod "7369b47f-c90e-42c9-859c-ef693d18e0e2" (UID: "7369b47f-c90e-42c9-859c-ef693d18e0e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.552675 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.556284 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7369b47f-c90e-42c9-859c-ef693d18e0e2-kube-api-access-g4j94" (OuterVolumeSpecName: "kube-api-access-g4j94") pod "7369b47f-c90e-42c9-859c-ef693d18e0e2" (UID: "7369b47f-c90e-42c9-859c-ef693d18e0e2"). InnerVolumeSpecName "kube-api-access-g4j94". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.620021 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7369b47f-c90e-42c9-859c-ef693d18e0e2" (UID: "7369b47f-c90e-42c9-859c-ef693d18e0e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.654532 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4j94\" (UniqueName: \"kubernetes.io/projected/7369b47f-c90e-42c9-859c-ef693d18e0e2-kube-api-access-g4j94\") on node \"crc\" DevicePath \"\"" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.654568 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7369b47f-c90e-42c9-859c-ef693d18e0e2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.667679 4744 generic.go:334] "Generic (PLEG): container finished" podID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerID="9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2" exitCode=0 Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.667719 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerDied","Data":"9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2"} Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.667748 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qlpz2" event={"ID":"7369b47f-c90e-42c9-859c-ef693d18e0e2","Type":"ContainerDied","Data":"deeae08477e2aca2c17649575e5cd168abc24ded03a2f451dee7b67698f83642"} Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.667747 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qlpz2" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.667766 4744 scope.go:117] "RemoveContainer" containerID="9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.690863 4744 scope.go:117] "RemoveContainer" containerID="64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.701914 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qlpz2"] Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.710770 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qlpz2"] Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.723299 4744 scope.go:117] "RemoveContainer" containerID="6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.748961 4744 scope.go:117] "RemoveContainer" containerID="9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2" Sep 30 03:52:38 crc kubenswrapper[4744]: E0930 03:52:38.749380 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2\": container with ID starting with 9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2 not found: ID does not exist" containerID="9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.749413 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2"} err="failed to get container status \"9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2\": rpc error: code = NotFound desc = could not find 
container \"9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2\": container with ID starting with 9f7333a41f0d7187c3f86d4e07d69be548a4e7738b9079d5075f7314644c27b2 not found: ID does not exist" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.749433 4744 scope.go:117] "RemoveContainer" containerID="64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633" Sep 30 03:52:38 crc kubenswrapper[4744]: E0930 03:52:38.749754 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633\": container with ID starting with 64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633 not found: ID does not exist" containerID="64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.749791 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633"} err="failed to get container status \"64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633\": rpc error: code = NotFound desc = could not find container \"64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633\": container with ID starting with 64c4c1536ffaeabdc0b236b7c4b2440f7803c8ec3741b8f3c06e10c0b3d93633 not found: ID does not exist" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.749818 4744 scope.go:117] "RemoveContainer" containerID="6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5" Sep 30 03:52:38 crc kubenswrapper[4744]: E0930 03:52:38.750245 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5\": container with ID starting with 6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5 not found: ID does 
not exist" containerID="6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5" Sep 30 03:52:38 crc kubenswrapper[4744]: I0930 03:52:38.750314 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5"} err="failed to get container status \"6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5\": rpc error: code = NotFound desc = could not find container \"6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5\": container with ID starting with 6a120e80d1bfc075289872c1179d7f9211283d3bb7ce9a0f7641cc1a802a0dd5 not found: ID does not exist" Sep 30 03:52:39 crc kubenswrapper[4744]: I0930 03:52:39.522139 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" path="/var/lib/kubelet/pods/7369b47f-c90e-42c9-859c-ef693d18e0e2/volumes" Sep 30 03:53:04 crc kubenswrapper[4744]: I0930 03:53:04.348451 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:53:04 crc kubenswrapper[4744]: I0930 03:53:04.349029 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:53:34 crc kubenswrapper[4744]: I0930 03:53:34.348005 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Sep 30 03:53:34 crc kubenswrapper[4744]: I0930 03:53:34.348583 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.347558 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.347995 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.348041 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.348813 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.348856 4744 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" gracePeriod=600 Sep 30 03:54:04 crc kubenswrapper[4744]: E0930 03:54:04.469292 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.489419 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" exitCode=0 Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.489533 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e"} Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.489697 4744 scope.go:117] "RemoveContainer" containerID="99fe49680c1b6aa39bba2bdcee84bc111f502657f8907c2575eed1d9238b45f4" Sep 30 03:54:04 crc kubenswrapper[4744]: I0930 03:54:04.490374 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:54:04 crc kubenswrapper[4744]: E0930 03:54:04.490623 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:54:17 crc kubenswrapper[4744]: I0930 03:54:17.503276 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:54:17 crc kubenswrapper[4744]: E0930 03:54:17.504107 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:54:29 crc kubenswrapper[4744]: I0930 03:54:29.504809 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:54:29 crc kubenswrapper[4744]: E0930 03:54:29.505641 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:54:42 crc kubenswrapper[4744]: I0930 03:54:42.503999 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:54:42 crc kubenswrapper[4744]: E0930 03:54:42.505086 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:54:56 crc kubenswrapper[4744]: I0930 03:54:56.504449 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:54:56 crc kubenswrapper[4744]: E0930 03:54:56.505684 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:55:08 crc kubenswrapper[4744]: I0930 03:55:08.504413 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:55:08 crc kubenswrapper[4744]: E0930 03:55:08.505269 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:55:21 crc kubenswrapper[4744]: I0930 03:55:21.510007 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:55:21 crc kubenswrapper[4744]: E0930 03:55:21.510796 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:55:34 crc kubenswrapper[4744]: I0930 03:55:34.503790 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:55:34 crc kubenswrapper[4744]: E0930 03:55:34.504933 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:55:46 crc kubenswrapper[4744]: I0930 03:55:46.503887 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:55:46 crc kubenswrapper[4744]: E0930 03:55:46.504949 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:55:59 crc kubenswrapper[4744]: I0930 03:55:59.504511 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:55:59 crc kubenswrapper[4744]: E0930 03:55:59.505218 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:56:14 crc kubenswrapper[4744]: I0930 03:56:14.503570 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:56:14 crc kubenswrapper[4744]: E0930 03:56:14.504152 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:56:26 crc kubenswrapper[4744]: I0930 03:56:26.503749 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:56:26 crc kubenswrapper[4744]: E0930 03:56:26.504948 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:56:41 crc kubenswrapper[4744]: I0930 03:56:41.504970 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:56:41 crc kubenswrapper[4744]: E0930 03:56:41.505793 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:56:54 crc kubenswrapper[4744]: I0930 03:56:54.504042 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:56:54 crc kubenswrapper[4744]: E0930 03:56:54.504835 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.275775 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5l8x8"] Sep 30 03:56:55 crc kubenswrapper[4744]: E0930 03:56:55.276428 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="extract-content" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.276444 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="extract-content" Sep 30 03:56:55 crc kubenswrapper[4744]: E0930 03:56:55.276469 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="extract-utilities" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.276475 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="extract-utilities" Sep 30 03:56:55 crc kubenswrapper[4744]: E0930 
03:56:55.276490 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="registry-server" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.276497 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="registry-server" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.276714 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="7369b47f-c90e-42c9-859c-ef693d18e0e2" containerName="registry-server" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.280013 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.294713 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l8x8"] Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.343627 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkn2p\" (UniqueName: \"kubernetes.io/projected/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-kube-api-access-pkn2p\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.343765 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-utilities\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.343822 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-catalog-content\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.445967 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkn2p\" (UniqueName: \"kubernetes.io/projected/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-kube-api-access-pkn2p\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.446062 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-utilities\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.446105 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-catalog-content\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.447661 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-catalog-content\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.447662 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-utilities\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.472311 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkn2p\" (UniqueName: \"kubernetes.io/projected/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-kube-api-access-pkn2p\") pod \"redhat-marketplace-5l8x8\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:55 crc kubenswrapper[4744]: I0930 03:56:55.599150 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:56:56 crc kubenswrapper[4744]: I0930 03:56:56.250921 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l8x8"] Sep 30 03:56:56 crc kubenswrapper[4744]: W0930 03:56:56.255776 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e0e8f0_5dbb_42d4_b7d0_99972ad8c2b0.slice/crio-5f1ddb10d7869883767f7f4f614ba8f09d46066f3796d62ad444c3ffbf6a6e4b WatchSource:0}: Error finding container 5f1ddb10d7869883767f7f4f614ba8f09d46066f3796d62ad444c3ffbf6a6e4b: Status 404 returned error can't find the container with id 5f1ddb10d7869883767f7f4f614ba8f09d46066f3796d62ad444c3ffbf6a6e4b Sep 30 03:56:57 crc kubenswrapper[4744]: I0930 03:56:57.216949 4744 generic.go:334] "Generic (PLEG): container finished" podID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerID="696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9" exitCode=0 Sep 30 03:56:57 crc kubenswrapper[4744]: I0930 03:56:57.217323 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" 
event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerDied","Data":"696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9"} Sep 30 03:56:57 crc kubenswrapper[4744]: I0930 03:56:57.217387 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerStarted","Data":"5f1ddb10d7869883767f7f4f614ba8f09d46066f3796d62ad444c3ffbf6a6e4b"} Sep 30 03:56:58 crc kubenswrapper[4744]: I0930 03:56:58.226216 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerStarted","Data":"51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0"} Sep 30 03:56:59 crc kubenswrapper[4744]: I0930 03:56:59.245132 4744 generic.go:334] "Generic (PLEG): container finished" podID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerID="51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0" exitCode=0 Sep 30 03:56:59 crc kubenswrapper[4744]: I0930 03:56:59.245187 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerDied","Data":"51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0"} Sep 30 03:57:00 crc kubenswrapper[4744]: I0930 03:57:00.257936 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerStarted","Data":"4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12"} Sep 30 03:57:00 crc kubenswrapper[4744]: I0930 03:57:00.281192 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5l8x8" podStartSLOduration=2.867675522 podStartE2EDuration="5.281174251s" podCreationTimestamp="2025-09-30 03:56:55 +0000 
UTC" firstStartedPulling="2025-09-30 03:56:57.219257878 +0000 UTC m=+3744.392477842" lastFinishedPulling="2025-09-30 03:56:59.632756597 +0000 UTC m=+3746.805976571" observedRunningTime="2025-09-30 03:57:00.273987227 +0000 UTC m=+3747.447207221" watchObservedRunningTime="2025-09-30 03:57:00.281174251 +0000 UTC m=+3747.454394225" Sep 30 03:57:05 crc kubenswrapper[4744]: I0930 03:57:05.503959 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:57:05 crc kubenswrapper[4744]: E0930 03:57:05.504769 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:57:05 crc kubenswrapper[4744]: I0930 03:57:05.602741 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:57:05 crc kubenswrapper[4744]: I0930 03:57:05.602888 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:57:05 crc kubenswrapper[4744]: I0930 03:57:05.891947 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:57:06 crc kubenswrapper[4744]: I0930 03:57:06.395230 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:57:06 crc kubenswrapper[4744]: I0930 03:57:06.457399 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l8x8"] Sep 30 03:57:08 crc kubenswrapper[4744]: I0930 03:57:08.333323 4744 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5l8x8" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="registry-server" containerID="cri-o://4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12" gracePeriod=2 Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.176797 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.187251 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkn2p\" (UniqueName: \"kubernetes.io/projected/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-kube-api-access-pkn2p\") pod \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.187446 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-catalog-content\") pod \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.187529 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-utilities\") pod \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\" (UID: \"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0\") " Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.188965 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-utilities" (OuterVolumeSpecName: "utilities") pod "b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" (UID: "b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.196664 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-kube-api-access-pkn2p" (OuterVolumeSpecName: "kube-api-access-pkn2p") pod "b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" (UID: "b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0"). InnerVolumeSpecName "kube-api-access-pkn2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.207970 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" (UID: "b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.288993 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkn2p\" (UniqueName: \"kubernetes.io/projected/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-kube-api-access-pkn2p\") on node \"crc\" DevicePath \"\"" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.289025 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.289035 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.348569 4744 generic.go:334] "Generic (PLEG): container finished" podID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" 
containerID="4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12" exitCode=0 Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.348632 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerDied","Data":"4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12"} Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.348673 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5l8x8" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.348707 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5l8x8" event={"ID":"b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0","Type":"ContainerDied","Data":"5f1ddb10d7869883767f7f4f614ba8f09d46066f3796d62ad444c3ffbf6a6e4b"} Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.348746 4744 scope.go:117] "RemoveContainer" containerID="4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.375540 4744 scope.go:117] "RemoveContainer" containerID="51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.394462 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l8x8"] Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.402299 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5l8x8"] Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.410112 4744 scope.go:117] "RemoveContainer" containerID="696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.448072 4744 scope.go:117] "RemoveContainer" containerID="4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12" Sep 30 
03:57:09 crc kubenswrapper[4744]: E0930 03:57:09.449541 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12\": container with ID starting with 4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12 not found: ID does not exist" containerID="4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.449593 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12"} err="failed to get container status \"4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12\": rpc error: code = NotFound desc = could not find container \"4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12\": container with ID starting with 4fea61fb58cd8a2d1b551b7d3826cedb37d454d6a8e5a26f703553d7b00d9b12 not found: ID does not exist" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.449625 4744 scope.go:117] "RemoveContainer" containerID="51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0" Sep 30 03:57:09 crc kubenswrapper[4744]: E0930 03:57:09.449963 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0\": container with ID starting with 51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0 not found: ID does not exist" containerID="51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.450051 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0"} err="failed to get container status 
\"51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0\": rpc error: code = NotFound desc = could not find container \"51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0\": container with ID starting with 51fd3f70128a784824e27da9c1ceef3dd72dd77a5564a940175f6f11ef823fe0 not found: ID does not exist" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.450128 4744 scope.go:117] "RemoveContainer" containerID="696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9" Sep 30 03:57:09 crc kubenswrapper[4744]: E0930 03:57:09.450571 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9\": container with ID starting with 696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9 not found: ID does not exist" containerID="696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.450603 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9"} err="failed to get container status \"696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9\": rpc error: code = NotFound desc = could not find container \"696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9\": container with ID starting with 696b50d333fa2709a1415baed747a3cf7b641456fced0b7ef58985378f97c9a9 not found: ID does not exist" Sep 30 03:57:09 crc kubenswrapper[4744]: I0930 03:57:09.519274 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" path="/var/lib/kubelet/pods/b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0/volumes" Sep 30 03:57:18 crc kubenswrapper[4744]: I0930 03:57:18.504145 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 
03:57:18 crc kubenswrapper[4744]: E0930 03:57:18.504996 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:57:26 crc kubenswrapper[4744]: I0930 03:57:26.736877 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 03:57:30 crc kubenswrapper[4744]: I0930 03:57:30.504270 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:57:30 crc kubenswrapper[4744]: E0930 03:57:30.505154 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:57:45 crc kubenswrapper[4744]: I0930 03:57:45.503893 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:57:45 crc kubenswrapper[4744]: E0930 03:57:45.504744 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:57:56 crc kubenswrapper[4744]: I0930 03:57:56.503990 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:57:56 crc kubenswrapper[4744]: E0930 03:57:56.504725 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:58:09 crc kubenswrapper[4744]: I0930 03:58:09.503976 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:58:09 crc kubenswrapper[4744]: E0930 03:58:09.504845 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:58:21 crc kubenswrapper[4744]: I0930 03:58:21.503504 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:58:21 crc kubenswrapper[4744]: E0930 03:58:21.504184 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:58:34 crc kubenswrapper[4744]: I0930 03:58:34.503170 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:58:34 crc kubenswrapper[4744]: E0930 03:58:34.503889 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:58:46 crc kubenswrapper[4744]: I0930 03:58:46.503785 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:58:46 crc kubenswrapper[4744]: E0930 03:58:46.504979 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:59:01 crc kubenswrapper[4744]: I0930 03:59:01.504252 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:59:01 crc kubenswrapper[4744]: E0930 03:59:01.505321 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 03:59:15 crc kubenswrapper[4744]: I0930 03:59:15.504837 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 03:59:16 crc kubenswrapper[4744]: I0930 03:59:16.575927 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"77c136cce14743e0c268172e528736b39669a7a27f3fec9c8c18d04a6a426adf"} Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.180264 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t"] Sep 30 04:00:00 crc kubenswrapper[4744]: E0930 04:00:00.182535 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="registry-server" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.182622 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="registry-server" Sep 30 04:00:00 crc kubenswrapper[4744]: E0930 04:00:00.182759 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="extract-utilities" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.182819 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="extract-utilities" Sep 30 04:00:00 crc kubenswrapper[4744]: E0930 04:00:00.182882 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="extract-content" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.183495 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="extract-content" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.183913 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e0e8f0-5dbb-42d4-b7d0-99972ad8c2b0" containerName="registry-server" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.184639 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.186797 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.186866 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.193713 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t"] Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.270722 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40eaa93c-e137-4d09-815b-48d968ac955c-config-volume\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.270940 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxn4f\" (UniqueName: \"kubernetes.io/projected/40eaa93c-e137-4d09-815b-48d968ac955c-kube-api-access-mxn4f\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.271025 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40eaa93c-e137-4d09-815b-48d968ac955c-secret-volume\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.372789 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40eaa93c-e137-4d09-815b-48d968ac955c-config-volume\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.372988 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxn4f\" (UniqueName: \"kubernetes.io/projected/40eaa93c-e137-4d09-815b-48d968ac955c-kube-api-access-mxn4f\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.373068 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40eaa93c-e137-4d09-815b-48d968ac955c-secret-volume\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.373801 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/40eaa93c-e137-4d09-815b-48d968ac955c-config-volume\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.381173 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40eaa93c-e137-4d09-815b-48d968ac955c-secret-volume\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.408020 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxn4f\" (UniqueName: \"kubernetes.io/projected/40eaa93c-e137-4d09-815b-48d968ac955c-kube-api-access-mxn4f\") pod \"collect-profiles-29320080-t9g4t\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:00 crc kubenswrapper[4744]: I0930 04:00:00.508276 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:01 crc kubenswrapper[4744]: I0930 04:00:01.028230 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t"] Sep 30 04:00:01 crc kubenswrapper[4744]: I0930 04:00:01.958473 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" event={"ID":"40eaa93c-e137-4d09-815b-48d968ac955c","Type":"ContainerStarted","Data":"6d513d62a48fadcde004c0471fe586832ed6d896f1b59da29550c7df2ff1f04e"} Sep 30 04:00:01 crc kubenswrapper[4744]: I0930 04:00:01.958828 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" event={"ID":"40eaa93c-e137-4d09-815b-48d968ac955c","Type":"ContainerStarted","Data":"70225d8c7ad7690ff74f50ec80bef064b43dc84cfe39e0482b5e92b6e3da4cf7"} Sep 30 04:00:01 crc kubenswrapper[4744]: I0930 04:00:01.977683 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" podStartSLOduration=1.9776665960000002 podStartE2EDuration="1.977666596s" podCreationTimestamp="2025-09-30 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 04:00:01.971149643 +0000 UTC m=+3929.144369617" watchObservedRunningTime="2025-09-30 04:00:01.977666596 +0000 UTC m=+3929.150886570" Sep 30 04:00:02 crc kubenswrapper[4744]: I0930 04:00:02.970350 4744 generic.go:334] "Generic (PLEG): container finished" podID="40eaa93c-e137-4d09-815b-48d968ac955c" containerID="6d513d62a48fadcde004c0471fe586832ed6d896f1b59da29550c7df2ff1f04e" exitCode=0 Sep 30 04:00:02 crc kubenswrapper[4744]: I0930 04:00:02.970621 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" event={"ID":"40eaa93c-e137-4d09-815b-48d968ac955c","Type":"ContainerDied","Data":"6d513d62a48fadcde004c0471fe586832ed6d896f1b59da29550c7df2ff1f04e"} Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.585607 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.672140 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40eaa93c-e137-4d09-815b-48d968ac955c-config-volume\") pod \"40eaa93c-e137-4d09-815b-48d968ac955c\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.672202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxn4f\" (UniqueName: \"kubernetes.io/projected/40eaa93c-e137-4d09-815b-48d968ac955c-kube-api-access-mxn4f\") pod \"40eaa93c-e137-4d09-815b-48d968ac955c\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.672361 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40eaa93c-e137-4d09-815b-48d968ac955c-secret-volume\") pod \"40eaa93c-e137-4d09-815b-48d968ac955c\" (UID: \"40eaa93c-e137-4d09-815b-48d968ac955c\") " Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.673067 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40eaa93c-e137-4d09-815b-48d968ac955c-config-volume" (OuterVolumeSpecName: "config-volume") pod "40eaa93c-e137-4d09-815b-48d968ac955c" (UID: "40eaa93c-e137-4d09-815b-48d968ac955c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.682037 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40eaa93c-e137-4d09-815b-48d968ac955c-kube-api-access-mxn4f" (OuterVolumeSpecName: "kube-api-access-mxn4f") pod "40eaa93c-e137-4d09-815b-48d968ac955c" (UID: "40eaa93c-e137-4d09-815b-48d968ac955c"). InnerVolumeSpecName "kube-api-access-mxn4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.683995 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40eaa93c-e137-4d09-815b-48d968ac955c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40eaa93c-e137-4d09-815b-48d968ac955c" (UID: "40eaa93c-e137-4d09-815b-48d968ac955c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.774525 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40eaa93c-e137-4d09-815b-48d968ac955c-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.774561 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40eaa93c-e137-4d09-815b-48d968ac955c-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.774571 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxn4f\" (UniqueName: \"kubernetes.io/projected/40eaa93c-e137-4d09-815b-48d968ac955c-kube-api-access-mxn4f\") on node \"crc\" DevicePath \"\"" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.987519 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" 
event={"ID":"40eaa93c-e137-4d09-815b-48d968ac955c","Type":"ContainerDied","Data":"70225d8c7ad7690ff74f50ec80bef064b43dc84cfe39e0482b5e92b6e3da4cf7"} Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.987753 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70225d8c7ad7690ff74f50ec80bef064b43dc84cfe39e0482b5e92b6e3da4cf7" Sep 30 04:00:04 crc kubenswrapper[4744]: I0930 04:00:04.987564 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320080-t9g4t" Sep 30 04:00:05 crc kubenswrapper[4744]: I0930 04:00:05.058411 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2"] Sep 30 04:00:05 crc kubenswrapper[4744]: I0930 04:00:05.066758 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320035-2h8r2"] Sep 30 04:00:05 crc kubenswrapper[4744]: I0930 04:00:05.520795 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c155cc9-f717-47c3-924e-5fbb08c82456" path="/var/lib/kubelet/pods/7c155cc9-f717-47c3-924e-5fbb08c82456/volumes" Sep 30 04:00:42 crc kubenswrapper[4744]: I0930 04:00:42.425501 4744 scope.go:117] "RemoveContainer" containerID="8f2cdf7dd7d721d6924555a7ea45c66f1015a9330a70a71342fbd1cbb353453a" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.154857 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320081-57dhj"] Sep 30 04:01:00 crc kubenswrapper[4744]: E0930 04:01:00.155756 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40eaa93c-e137-4d09-815b-48d968ac955c" containerName="collect-profiles" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.155771 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="40eaa93c-e137-4d09-815b-48d968ac955c" containerName="collect-profiles" Sep 30 04:01:00 crc 
kubenswrapper[4744]: I0930 04:01:00.156027 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="40eaa93c-e137-4d09-815b-48d968ac955c" containerName="collect-profiles" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.156682 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.187697 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320081-57dhj"] Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.287580 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-fernet-keys\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.287966 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-combined-ca-bundle\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.288006 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-config-data\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.288031 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5lh\" (UniqueName: 
\"kubernetes.io/projected/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-kube-api-access-cg5lh\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.389763 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-config-data\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.389806 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5lh\" (UniqueName: \"kubernetes.io/projected/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-kube-api-access-cg5lh\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.389944 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-fernet-keys\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.390015 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-combined-ca-bundle\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.396898 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-config-data\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.397116 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-fernet-keys\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.413157 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-combined-ca-bundle\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.413240 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5lh\" (UniqueName: \"kubernetes.io/projected/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-kube-api-access-cg5lh\") pod \"keystone-cron-29320081-57dhj\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.474858 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:00 crc kubenswrapper[4744]: I0930 04:01:00.968090 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320081-57dhj"] Sep 30 04:01:01 crc kubenswrapper[4744]: I0930 04:01:01.546785 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320081-57dhj" podStartSLOduration=1.546759308 podStartE2EDuration="1.546759308s" podCreationTimestamp="2025-09-30 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 04:01:01.538250262 +0000 UTC m=+3988.711470236" watchObservedRunningTime="2025-09-30 04:01:01.546759308 +0000 UTC m=+3988.719979292" Sep 30 04:01:01 crc kubenswrapper[4744]: I0930 04:01:01.547686 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320081-57dhj" event={"ID":"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec","Type":"ContainerStarted","Data":"974a1618b060d3ffdb652ccc3143b968f57f46ad34d077496237905e4f3cbb9b"} Sep 30 04:01:01 crc kubenswrapper[4744]: I0930 04:01:01.547718 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320081-57dhj" event={"ID":"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec","Type":"ContainerStarted","Data":"19814af567f11450bdcab4689f9d0f79860058a144deb75c564f7b482626c7c3"} Sep 30 04:01:04 crc kubenswrapper[4744]: I0930 04:01:04.533713 4744 generic.go:334] "Generic (PLEG): container finished" podID="6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" containerID="974a1618b060d3ffdb652ccc3143b968f57f46ad34d077496237905e4f3cbb9b" exitCode=0 Sep 30 04:01:04 crc kubenswrapper[4744]: I0930 04:01:04.533880 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320081-57dhj" 
event={"ID":"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec","Type":"ContainerDied","Data":"974a1618b060d3ffdb652ccc3143b968f57f46ad34d077496237905e4f3cbb9b"} Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.357043 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c7h9z"] Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.359802 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.382222 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c7h9z"] Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.387196 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-utilities\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.387321 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-catalog-content\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.387427 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxh5\" (UniqueName: \"kubernetes.io/projected/61284b85-f6e9-43cb-a34c-f1ee3800661e-kube-api-access-5rxh5\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.488832 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-utilities\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.489174 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-catalog-content\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.489333 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxh5\" (UniqueName: \"kubernetes.io/projected/61284b85-f6e9-43cb-a34c-f1ee3800661e-kube-api-access-5rxh5\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.489932 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-utilities\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.489940 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-catalog-content\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.526724 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxh5\" 
(UniqueName: \"kubernetes.io/projected/61284b85-f6e9-43cb-a34c-f1ee3800661e-kube-api-access-5rxh5\") pod \"redhat-operators-c7h9z\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:05 crc kubenswrapper[4744]: I0930 04:01:05.700882 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.347997 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.383536 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c7h9z"] Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.429209 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg5lh\" (UniqueName: \"kubernetes.io/projected/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-kube-api-access-cg5lh\") pod \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.429488 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-fernet-keys\") pod \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.429584 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-config-data\") pod \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.429626 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-combined-ca-bundle\") pod \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\" (UID: \"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec\") " Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.436356 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-kube-api-access-cg5lh" (OuterVolumeSpecName: "kube-api-access-cg5lh") pod "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" (UID: "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec"). InnerVolumeSpecName "kube-api-access-cg5lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.437908 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" (UID: "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.459744 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" (UID: "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.497761 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-config-data" (OuterVolumeSpecName: "config-data") pod "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" (UID: "6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.534977 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg5lh\" (UniqueName: \"kubernetes.io/projected/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-kube-api-access-cg5lh\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.535009 4744 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.535019 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.535029 4744 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.563263 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320081-57dhj" event={"ID":"6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec","Type":"ContainerDied","Data":"19814af567f11450bdcab4689f9d0f79860058a144deb75c564f7b482626c7c3"} Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.563296 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19814af567f11450bdcab4689f9d0f79860058a144deb75c564f7b482626c7c3" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.563303 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320081-57dhj" Sep 30 04:01:06 crc kubenswrapper[4744]: I0930 04:01:06.569817 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerStarted","Data":"3dec87c0c88089a53c8ae66281a587c17f0afb49d35a765dc2a665fa81e1cd08"} Sep 30 04:01:07 crc kubenswrapper[4744]: I0930 04:01:07.581454 4744 generic.go:334] "Generic (PLEG): container finished" podID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerID="f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e" exitCode=0 Sep 30 04:01:07 crc kubenswrapper[4744]: I0930 04:01:07.582084 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerDied","Data":"f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e"} Sep 30 04:01:07 crc kubenswrapper[4744]: I0930 04:01:07.585778 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 04:01:09 crc kubenswrapper[4744]: I0930 04:01:09.602258 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerStarted","Data":"00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824"} Sep 30 04:01:11 crc kubenswrapper[4744]: I0930 04:01:11.624623 4744 generic.go:334] "Generic (PLEG): container finished" podID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerID="00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824" exitCode=0 Sep 30 04:01:11 crc kubenswrapper[4744]: I0930 04:01:11.624704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" 
event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerDied","Data":"00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824"} Sep 30 04:01:12 crc kubenswrapper[4744]: I0930 04:01:12.635738 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerStarted","Data":"5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d"} Sep 30 04:01:12 crc kubenswrapper[4744]: I0930 04:01:12.654299 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c7h9z" podStartSLOduration=3.180556381 podStartE2EDuration="7.654278936s" podCreationTimestamp="2025-09-30 04:01:05 +0000 UTC" firstStartedPulling="2025-09-30 04:01:07.585261537 +0000 UTC m=+3994.758481551" lastFinishedPulling="2025-09-30 04:01:12.058984132 +0000 UTC m=+3999.232204106" observedRunningTime="2025-09-30 04:01:12.653666108 +0000 UTC m=+3999.826886082" watchObservedRunningTime="2025-09-30 04:01:12.654278936 +0000 UTC m=+3999.827498920" Sep 30 04:01:15 crc kubenswrapper[4744]: I0930 04:01:15.701611 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:15 crc kubenswrapper[4744]: I0930 04:01:15.702181 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:16 crc kubenswrapper[4744]: I0930 04:01:16.778096 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c7h9z" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="registry-server" probeResult="failure" output=< Sep 30 04:01:16 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 04:01:16 crc kubenswrapper[4744]: > Sep 30 04:01:25 crc kubenswrapper[4744]: I0930 04:01:25.781359 4744 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:25 crc kubenswrapper[4744]: I0930 04:01:25.860264 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:26 crc kubenswrapper[4744]: I0930 04:01:26.027830 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c7h9z"] Sep 30 04:01:27 crc kubenswrapper[4744]: I0930 04:01:27.784660 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c7h9z" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="registry-server" containerID="cri-o://5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d" gracePeriod=2 Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.409254 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.523954 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-catalog-content\") pod \"61284b85-f6e9-43cb-a34c-f1ee3800661e\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.524131 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rxh5\" (UniqueName: \"kubernetes.io/projected/61284b85-f6e9-43cb-a34c-f1ee3800661e-kube-api-access-5rxh5\") pod \"61284b85-f6e9-43cb-a34c-f1ee3800661e\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.526246 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-utilities\") pod 
\"61284b85-f6e9-43cb-a34c-f1ee3800661e\" (UID: \"61284b85-f6e9-43cb-a34c-f1ee3800661e\") " Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.528538 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-utilities" (OuterVolumeSpecName: "utilities") pod "61284b85-f6e9-43cb-a34c-f1ee3800661e" (UID: "61284b85-f6e9-43cb-a34c-f1ee3800661e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.539040 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61284b85-f6e9-43cb-a34c-f1ee3800661e-kube-api-access-5rxh5" (OuterVolumeSpecName: "kube-api-access-5rxh5") pod "61284b85-f6e9-43cb-a34c-f1ee3800661e" (UID: "61284b85-f6e9-43cb-a34c-f1ee3800661e"). InnerVolumeSpecName "kube-api-access-5rxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.593946 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61284b85-f6e9-43cb-a34c-f1ee3800661e" (UID: "61284b85-f6e9-43cb-a34c-f1ee3800661e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.629665 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.629703 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61284b85-f6e9-43cb-a34c-f1ee3800661e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.629714 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rxh5\" (UniqueName: \"kubernetes.io/projected/61284b85-f6e9-43cb-a34c-f1ee3800661e-kube-api-access-5rxh5\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.794852 4744 generic.go:334] "Generic (PLEG): container finished" podID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerID="5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d" exitCode=0 Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.794901 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerDied","Data":"5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d"} Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.794932 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c7h9z" event={"ID":"61284b85-f6e9-43cb-a34c-f1ee3800661e","Type":"ContainerDied","Data":"3dec87c0c88089a53c8ae66281a587c17f0afb49d35a765dc2a665fa81e1cd08"} Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.794958 4744 scope.go:117] "RemoveContainer" containerID="5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.795525 
4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c7h9z" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.827681 4744 scope.go:117] "RemoveContainer" containerID="00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.839486 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c7h9z"] Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.847038 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c7h9z"] Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.860431 4744 scope.go:117] "RemoveContainer" containerID="f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.899798 4744 scope.go:117] "RemoveContainer" containerID="5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d" Sep 30 04:01:28 crc kubenswrapper[4744]: E0930 04:01:28.901663 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d\": container with ID starting with 5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d not found: ID does not exist" containerID="5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.901703 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d"} err="failed to get container status \"5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d\": rpc error: code = NotFound desc = could not find container \"5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d\": container with ID starting with 
5d2df6b85feda7bf75498f4183a3c748b5f6b353706ea0d33126d8aee67eef3d not found: ID does not exist" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.901729 4744 scope.go:117] "RemoveContainer" containerID="00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824" Sep 30 04:01:28 crc kubenswrapper[4744]: E0930 04:01:28.902054 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824\": container with ID starting with 00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824 not found: ID does not exist" containerID="00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.902103 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824"} err="failed to get container status \"00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824\": rpc error: code = NotFound desc = could not find container \"00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824\": container with ID starting with 00c8d03ba2e5cfddaabb67243682d4288026124086a4a84dc4f42023896b0824 not found: ID does not exist" Sep 30 04:01:28 crc kubenswrapper[4744]: I0930 04:01:28.902133 4744 scope.go:117] "RemoveContainer" containerID="f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e" Sep 30 04:01:28 crc kubenswrapper[4744]: E0930 04:01:28.902420 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e\": container with ID starting with f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e not found: ID does not exist" containerID="f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e" Sep 30 04:01:28 crc 
kubenswrapper[4744]: I0930 04:01:28.902465 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e"} err="failed to get container status \"f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e\": rpc error: code = NotFound desc = could not find container \"f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e\": container with ID starting with f18c751662e28d3dc09c22bb630b6a79c6d24997e03d09786957e0a42875c18e not found: ID does not exist" Sep 30 04:01:29 crc kubenswrapper[4744]: I0930 04:01:29.514225 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" path="/var/lib/kubelet/pods/61284b85-f6e9-43cb-a34c-f1ee3800661e/volumes" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.443310 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8qgkj"] Sep 30 04:01:31 crc kubenswrapper[4744]: E0930 04:01:31.443976 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="extract-content" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.444018 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="extract-content" Sep 30 04:01:31 crc kubenswrapper[4744]: E0930 04:01:31.444064 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="extract-utilities" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.444078 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="extract-utilities" Sep 30 04:01:31 crc kubenswrapper[4744]: E0930 04:01:31.444108 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" containerName="keystone-cron" Sep 
30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.444120 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" containerName="keystone-cron" Sep 30 04:01:31 crc kubenswrapper[4744]: E0930 04:01:31.444149 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="registry-server" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.444161 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="registry-server" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.444536 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="61284b85-f6e9-43cb-a34c-f1ee3800661e" containerName="registry-server" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.444564 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec" containerName="keystone-cron" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.447218 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.470902 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qgkj"] Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.590707 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvz2d\" (UniqueName: \"kubernetes.io/projected/ad02b603-7a45-4c7b-a41a-0539f97e90c9-kube-api-access-vvz2d\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.590854 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-catalog-content\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.591652 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-utilities\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.693650 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvz2d\" (UniqueName: \"kubernetes.io/projected/ad02b603-7a45-4c7b-a41a-0539f97e90c9-kube-api-access-vvz2d\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.693773 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-catalog-content\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.693940 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-utilities\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.695119 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-utilities\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.695120 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-catalog-content\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:31 crc kubenswrapper[4744]: I0930 04:01:31.858581 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvz2d\" (UniqueName: \"kubernetes.io/projected/ad02b603-7a45-4c7b-a41a-0539f97e90c9-kube-api-access-vvz2d\") pod \"certified-operators-8qgkj\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:32 crc kubenswrapper[4744]: I0930 04:01:32.092432 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:32 crc kubenswrapper[4744]: I0930 04:01:32.609287 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qgkj"] Sep 30 04:01:32 crc kubenswrapper[4744]: I0930 04:01:32.836295 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerStarted","Data":"a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca"} Sep 30 04:01:32 crc kubenswrapper[4744]: I0930 04:01:32.836358 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerStarted","Data":"43c15e8362fd122665f67055b2d8053c338728d21319b36f0bb1d73df1c5ef2b"} Sep 30 04:01:33 crc kubenswrapper[4744]: I0930 04:01:33.848282 4744 generic.go:334] "Generic (PLEG): container finished" podID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerID="a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca" exitCode=0 Sep 30 04:01:33 crc kubenswrapper[4744]: I0930 04:01:33.848734 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerDied","Data":"a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca"} Sep 30 04:01:34 crc kubenswrapper[4744]: I0930 04:01:34.347588 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:01:34 crc kubenswrapper[4744]: I0930 04:01:34.347937 4744 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:01:34 crc kubenswrapper[4744]: I0930 04:01:34.857953 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerStarted","Data":"87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb"} Sep 30 04:01:35 crc kubenswrapper[4744]: I0930 04:01:35.868184 4744 generic.go:334] "Generic (PLEG): container finished" podID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerID="87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb" exitCode=0 Sep 30 04:01:35 crc kubenswrapper[4744]: I0930 04:01:35.868263 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerDied","Data":"87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb"} Sep 30 04:01:36 crc kubenswrapper[4744]: I0930 04:01:36.890073 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerStarted","Data":"102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8"} Sep 30 04:01:42 crc kubenswrapper[4744]: I0930 04:01:42.094169 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:42 crc kubenswrapper[4744]: I0930 04:01:42.094778 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:42 crc kubenswrapper[4744]: I0930 04:01:42.165774 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:42 crc kubenswrapper[4744]: I0930 04:01:42.198043 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8qgkj" podStartSLOduration=8.530819691 podStartE2EDuration="11.198026188s" podCreationTimestamp="2025-09-30 04:01:31 +0000 UTC" firstStartedPulling="2025-09-30 04:01:33.85120949 +0000 UTC m=+4021.024429464" lastFinishedPulling="2025-09-30 04:01:36.518415987 +0000 UTC m=+4023.691635961" observedRunningTime="2025-09-30 04:01:36.907742982 +0000 UTC m=+4024.080962956" watchObservedRunningTime="2025-09-30 04:01:42.198026188 +0000 UTC m=+4029.371246162" Sep 30 04:01:43 crc kubenswrapper[4744]: I0930 04:01:43.030812 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:43 crc kubenswrapper[4744]: I0930 04:01:43.096004 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qgkj"] Sep 30 04:01:44 crc kubenswrapper[4744]: I0930 04:01:44.980527 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8qgkj" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="registry-server" containerID="cri-o://102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8" gracePeriod=2 Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.867030 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.980482 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-utilities\") pod \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.980544 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-catalog-content\") pod \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.980828 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvz2d\" (UniqueName: \"kubernetes.io/projected/ad02b603-7a45-4c7b-a41a-0539f97e90c9-kube-api-access-vvz2d\") pod \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\" (UID: \"ad02b603-7a45-4c7b-a41a-0539f97e90c9\") " Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.981341 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-utilities" (OuterVolumeSpecName: "utilities") pod "ad02b603-7a45-4c7b-a41a-0539f97e90c9" (UID: "ad02b603-7a45-4c7b-a41a-0539f97e90c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.991866 4744 generic.go:334] "Generic (PLEG): container finished" podID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerID="102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8" exitCode=0 Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.992108 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerDied","Data":"102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8"} Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.992264 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qgkj" event={"ID":"ad02b603-7a45-4c7b-a41a-0539f97e90c9","Type":"ContainerDied","Data":"43c15e8362fd122665f67055b2d8053c338728d21319b36f0bb1d73df1c5ef2b"} Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.992283 4744 scope.go:117] "RemoveContainer" containerID="102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8" Sep 30 04:01:45 crc kubenswrapper[4744]: I0930 04:01:45.992407 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qgkj" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.000524 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad02b603-7a45-4c7b-a41a-0539f97e90c9-kube-api-access-vvz2d" (OuterVolumeSpecName: "kube-api-access-vvz2d") pod "ad02b603-7a45-4c7b-a41a-0539f97e90c9" (UID: "ad02b603-7a45-4c7b-a41a-0539f97e90c9"). InnerVolumeSpecName "kube-api-access-vvz2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.023713 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad02b603-7a45-4c7b-a41a-0539f97e90c9" (UID: "ad02b603-7a45-4c7b-a41a-0539f97e90c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.049927 4744 scope.go:117] "RemoveContainer" containerID="87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.081256 4744 scope.go:117] "RemoveContainer" containerID="a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.082589 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.082617 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvz2d\" (UniqueName: \"kubernetes.io/projected/ad02b603-7a45-4c7b-a41a-0539f97e90c9-kube-api-access-vvz2d\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.082629 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad02b603-7a45-4c7b-a41a-0539f97e90c9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.118333 4744 scope.go:117] "RemoveContainer" containerID="102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8" Sep 30 04:01:46 crc kubenswrapper[4744]: E0930 04:01:46.118834 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8\": container with ID starting with 102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8 not found: ID does not exist" containerID="102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.118893 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8"} err="failed to get container status \"102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8\": rpc error: code = NotFound desc = could not find container \"102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8\": container with ID starting with 102572d5c048c312850cb48fd1ee78c59f0a34c356cdd5bc9ac864329972b6d8 not found: ID does not exist" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.118926 4744 scope.go:117] "RemoveContainer" containerID="87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb" Sep 30 04:01:46 crc kubenswrapper[4744]: E0930 04:01:46.119237 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb\": container with ID starting with 87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb not found: ID does not exist" containerID="87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.119268 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb"} err="failed to get container status \"87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb\": rpc error: code = NotFound desc = could not find container 
\"87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb\": container with ID starting with 87621127bfd31bb19ca6b110563085911ea59c029932fc6af7b9958b6516dcfb not found: ID does not exist" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.119287 4744 scope.go:117] "RemoveContainer" containerID="a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca" Sep 30 04:01:46 crc kubenswrapper[4744]: E0930 04:01:46.119515 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca\": container with ID starting with a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca not found: ID does not exist" containerID="a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.119538 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca"} err="failed to get container status \"a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca\": rpc error: code = NotFound desc = could not find container \"a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca\": container with ID starting with a003b584efd4df8bf5fb22c113ccfe8280128553c2a64c12a7662536841a6bca not found: ID does not exist" Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.321690 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qgkj"] Sep 30 04:01:46 crc kubenswrapper[4744]: I0930 04:01:46.330842 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8qgkj"] Sep 30 04:01:47 crc kubenswrapper[4744]: I0930 04:01:47.513313 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" 
path="/var/lib/kubelet/pods/ad02b603-7a45-4c7b-a41a-0539f97e90c9/volumes" Sep 30 04:02:04 crc kubenswrapper[4744]: I0930 04:02:04.347773 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:02:04 crc kubenswrapper[4744]: I0930 04:02:04.348405 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:02:34 crc kubenswrapper[4744]: I0930 04:02:34.347988 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:02:34 crc kubenswrapper[4744]: I0930 04:02:34.350119 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:02:34 crc kubenswrapper[4744]: I0930 04:02:34.350258 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 04:02:34 crc kubenswrapper[4744]: I0930 04:02:34.351263 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"77c136cce14743e0c268172e528736b39669a7a27f3fec9c8c18d04a6a426adf"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 04:02:34 crc kubenswrapper[4744]: I0930 04:02:34.351545 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://77c136cce14743e0c268172e528736b39669a7a27f3fec9c8c18d04a6a426adf" gracePeriod=600 Sep 30 04:02:35 crc kubenswrapper[4744]: I0930 04:02:35.478845 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="77c136cce14743e0c268172e528736b39669a7a27f3fec9c8c18d04a6a426adf" exitCode=0 Sep 30 04:02:35 crc kubenswrapper[4744]: I0930 04:02:35.479469 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"77c136cce14743e0c268172e528736b39669a7a27f3fec9c8c18d04a6a426adf"} Sep 30 04:02:35 crc kubenswrapper[4744]: I0930 04:02:35.479508 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"} Sep 30 04:02:35 crc kubenswrapper[4744]: I0930 04:02:35.479532 4744 scope.go:117] "RemoveContainer" containerID="5b889c64ea2d4933e27b56c2e66ad16b22627a8426f89c3cea87d94749d19c6e" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.511760 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hbdvz"] Sep 30 04:02:52 crc kubenswrapper[4744]: E0930 
04:02:52.514427 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="extract-utilities" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.514450 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="extract-utilities" Sep 30 04:02:52 crc kubenswrapper[4744]: E0930 04:02:52.514467 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="extract-content" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.514474 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="extract-content" Sep 30 04:02:52 crc kubenswrapper[4744]: E0930 04:02:52.514485 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="registry-server" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.514491 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="registry-server" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.514683 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad02b603-7a45-4c7b-a41a-0539f97e90c9" containerName="registry-server" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.515997 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.536816 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbdvz"] Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.591457 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-utilities\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.591515 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4wbl\" (UniqueName: \"kubernetes.io/projected/b51d674b-a85a-4b59-a20a-b4e7107e8640-kube-api-access-d4wbl\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.591692 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-catalog-content\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.692957 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-catalog-content\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.693085 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-utilities\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.693106 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4wbl\" (UniqueName: \"kubernetes.io/projected/b51d674b-a85a-4b59-a20a-b4e7107e8640-kube-api-access-d4wbl\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.693500 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-utilities\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.693518 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-catalog-content\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.717291 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4wbl\" (UniqueName: \"kubernetes.io/projected/b51d674b-a85a-4b59-a20a-b4e7107e8640-kube-api-access-d4wbl\") pod \"community-operators-hbdvz\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:52 crc kubenswrapper[4744]: I0930 04:02:52.865404 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:02:53 crc kubenswrapper[4744]: I0930 04:02:53.398055 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbdvz"] Sep 30 04:02:53 crc kubenswrapper[4744]: W0930 04:02:53.403506 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb51d674b_a85a_4b59_a20a_b4e7107e8640.slice/crio-113385cfe073a15b7c8e3537412fb83ea6c4b90fd1d77bc666ff2072250efbe8 WatchSource:0}: Error finding container 113385cfe073a15b7c8e3537412fb83ea6c4b90fd1d77bc666ff2072250efbe8: Status 404 returned error can't find the container with id 113385cfe073a15b7c8e3537412fb83ea6c4b90fd1d77bc666ff2072250efbe8 Sep 30 04:02:53 crc kubenswrapper[4744]: I0930 04:02:53.698186 4744 generic.go:334] "Generic (PLEG): container finished" podID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerID="261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2" exitCode=0 Sep 30 04:02:53 crc kubenswrapper[4744]: I0930 04:02:53.698250 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdvz" event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerDied","Data":"261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2"} Sep 30 04:02:53 crc kubenswrapper[4744]: I0930 04:02:53.698465 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdvz" event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerStarted","Data":"113385cfe073a15b7c8e3537412fb83ea6c4b90fd1d77bc666ff2072250efbe8"} Sep 30 04:02:55 crc kubenswrapper[4744]: I0930 04:02:55.721051 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdvz" 
event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerStarted","Data":"d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331"} Sep 30 04:02:56 crc kubenswrapper[4744]: I0930 04:02:56.729086 4744 generic.go:334] "Generic (PLEG): container finished" podID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerID="d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331" exitCode=0 Sep 30 04:02:56 crc kubenswrapper[4744]: I0930 04:02:56.729173 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdvz" event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerDied","Data":"d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331"} Sep 30 04:02:57 crc kubenswrapper[4744]: I0930 04:02:57.742599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdvz" event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerStarted","Data":"efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc"} Sep 30 04:02:57 crc kubenswrapper[4744]: I0930 04:02:57.778884 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hbdvz" podStartSLOduration=2.376787604 podStartE2EDuration="5.778864706s" podCreationTimestamp="2025-09-30 04:02:52 +0000 UTC" firstStartedPulling="2025-09-30 04:02:53.702433694 +0000 UTC m=+4100.875653678" lastFinishedPulling="2025-09-30 04:02:57.104510796 +0000 UTC m=+4104.277730780" observedRunningTime="2025-09-30 04:02:57.764770116 +0000 UTC m=+4104.937990110" watchObservedRunningTime="2025-09-30 04:02:57.778864706 +0000 UTC m=+4104.952084680" Sep 30 04:03:02 crc kubenswrapper[4744]: I0930 04:03:02.865524 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:03:02 crc kubenswrapper[4744]: I0930 04:03:02.865944 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:03:02 crc kubenswrapper[4744]: I0930 04:03:02.926209 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:03:03 crc kubenswrapper[4744]: I0930 04:03:03.873879 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:03:03 crc kubenswrapper[4744]: I0930 04:03:03.936137 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbdvz"] Sep 30 04:03:05 crc kubenswrapper[4744]: I0930 04:03:05.817850 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hbdvz" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="registry-server" containerID="cri-o://efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc" gracePeriod=2 Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.467545 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.578934 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4wbl\" (UniqueName: \"kubernetes.io/projected/b51d674b-a85a-4b59-a20a-b4e7107e8640-kube-api-access-d4wbl\") pod \"b51d674b-a85a-4b59-a20a-b4e7107e8640\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.579002 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-catalog-content\") pod \"b51d674b-a85a-4b59-a20a-b4e7107e8640\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.579210 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-utilities\") pod \"b51d674b-a85a-4b59-a20a-b4e7107e8640\" (UID: \"b51d674b-a85a-4b59-a20a-b4e7107e8640\") " Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.581232 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-utilities" (OuterVolumeSpecName: "utilities") pod "b51d674b-a85a-4b59-a20a-b4e7107e8640" (UID: "b51d674b-a85a-4b59-a20a-b4e7107e8640"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.585386 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51d674b-a85a-4b59-a20a-b4e7107e8640-kube-api-access-d4wbl" (OuterVolumeSpecName: "kube-api-access-d4wbl") pod "b51d674b-a85a-4b59-a20a-b4e7107e8640" (UID: "b51d674b-a85a-4b59-a20a-b4e7107e8640"). InnerVolumeSpecName "kube-api-access-d4wbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.629269 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b51d674b-a85a-4b59-a20a-b4e7107e8640" (UID: "b51d674b-a85a-4b59-a20a-b4e7107e8640"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.681045 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.681083 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4wbl\" (UniqueName: \"kubernetes.io/projected/b51d674b-a85a-4b59-a20a-b4e7107e8640-kube-api-access-d4wbl\") on node \"crc\" DevicePath \"\"" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.681096 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51d674b-a85a-4b59-a20a-b4e7107e8640-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.828500 4744 generic.go:334] "Generic (PLEG): container finished" podID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerID="efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc" exitCode=0 Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.828544 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdvz" event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerDied","Data":"efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc"} Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.828569 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hbdvz" event={"ID":"b51d674b-a85a-4b59-a20a-b4e7107e8640","Type":"ContainerDied","Data":"113385cfe073a15b7c8e3537412fb83ea6c4b90fd1d77bc666ff2072250efbe8"} Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.828587 4744 scope.go:117] "RemoveContainer" containerID="efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.828648 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbdvz" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.851980 4744 scope.go:117] "RemoveContainer" containerID="d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.864672 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbdvz"] Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.872090 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hbdvz"] Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.889159 4744 scope.go:117] "RemoveContainer" containerID="261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.945817 4744 scope.go:117] "RemoveContainer" containerID="efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc" Sep 30 04:03:06 crc kubenswrapper[4744]: E0930 04:03:06.947314 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc\": container with ID starting with efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc not found: ID does not exist" containerID="efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 
04:03:06.947386 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc"} err="failed to get container status \"efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc\": rpc error: code = NotFound desc = could not find container \"efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc\": container with ID starting with efc3b62a30c7095c1c9ab1d63f948e8d8883ea281894328b2800536ee31ba4fc not found: ID does not exist" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.947420 4744 scope.go:117] "RemoveContainer" containerID="d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331" Sep 30 04:03:06 crc kubenswrapper[4744]: E0930 04:03:06.947814 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331\": container with ID starting with d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331 not found: ID does not exist" containerID="d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.947906 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331"} err="failed to get container status \"d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331\": rpc error: code = NotFound desc = could not find container \"d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331\": container with ID starting with d6979c930ac175af2df6dcd987e083e3c730706c5cdbef9c765e1dbca41b1331 not found: ID does not exist" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.947981 4744 scope.go:117] "RemoveContainer" containerID="261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2" Sep 30 04:03:06 crc 
kubenswrapper[4744]: E0930 04:03:06.949052 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2\": container with ID starting with 261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2 not found: ID does not exist" containerID="261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2" Sep 30 04:03:06 crc kubenswrapper[4744]: I0930 04:03:06.949133 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2"} err="failed to get container status \"261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2\": rpc error: code = NotFound desc = could not find container \"261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2\": container with ID starting with 261d016b9ec2aaf42927f87cc01d6d2f6d0d49268cf9f4ec0b528724e4245fa2 not found: ID does not exist" Sep 30 04:03:07 crc kubenswrapper[4744]: I0930 04:03:07.520437 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" path="/var/lib/kubelet/pods/b51d674b-a85a-4b59-a20a-b4e7107e8640/volumes" Sep 30 04:04:34 crc kubenswrapper[4744]: I0930 04:04:34.347970 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:04:34 crc kubenswrapper[4744]: I0930 04:04:34.348389 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 04:05:04 crc kubenswrapper[4744]: I0930 04:05:04.347591 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:05:04 crc kubenswrapper[4744]: I0930 04:05:04.348042 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:05:34 crc kubenswrapper[4744]: I0930 04:05:34.348273 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:05:34 crc kubenswrapper[4744]: I0930 04:05:34.348909 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:05:34 crc kubenswrapper[4744]: I0930 04:05:34.348968 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 04:05:34 crc kubenswrapper[4744]: I0930 04:05:34.350035 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 04:05:34 crc kubenswrapper[4744]: I0930 04:05:34.350109 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" gracePeriod=600 Sep 30 04:05:34 crc kubenswrapper[4744]: E0930 04:05:34.503415 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:05:35 crc kubenswrapper[4744]: I0930 04:05:35.320754 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" exitCode=0 Sep 30 04:05:35 crc kubenswrapper[4744]: I0930 04:05:35.320801 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"} Sep 30 04:05:35 crc kubenswrapper[4744]: I0930 04:05:35.320860 4744 scope.go:117] "RemoveContainer" containerID="77c136cce14743e0c268172e528736b39669a7a27f3fec9c8c18d04a6a426adf" Sep 30 04:05:35 crc kubenswrapper[4744]: I0930 04:05:35.321550 4744 
scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:05:35 crc kubenswrapper[4744]: E0930 04:05:35.322023 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:05:48 crc kubenswrapper[4744]: I0930 04:05:48.503889 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:05:48 crc kubenswrapper[4744]: E0930 04:05:48.505142 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:06:00 crc kubenswrapper[4744]: I0930 04:06:00.504310 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:06:00 crc kubenswrapper[4744]: E0930 04:06:00.505599 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:06:13 crc kubenswrapper[4744]: I0930 
04:06:13.509192 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:06:13 crc kubenswrapper[4744]: E0930 04:06:13.510490 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:06:28 crc kubenswrapper[4744]: I0930 04:06:28.503857 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:06:28 crc kubenswrapper[4744]: E0930 04:06:28.504830 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:06:39 crc kubenswrapper[4744]: I0930 04:06:39.503514 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:06:39 crc kubenswrapper[4744]: E0930 04:06:39.504179 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:06:50 crc 
kubenswrapper[4744]: I0930 04:06:50.503268 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:06:50 crc kubenswrapper[4744]: E0930 04:06:50.504803 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:07:03 crc kubenswrapper[4744]: I0930 04:07:03.515737 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:07:03 crc kubenswrapper[4744]: E0930 04:07:03.516712 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:07:18 crc kubenswrapper[4744]: I0930 04:07:18.504716 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:07:18 crc kubenswrapper[4744]: E0930 04:07:18.505496 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:07:30 crc kubenswrapper[4744]: I0930 04:07:30.504407 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:07:30 crc kubenswrapper[4744]: E0930 04:07:30.505653 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:07:45 crc kubenswrapper[4744]: I0930 04:07:45.504802 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:07:45 crc kubenswrapper[4744]: E0930 04:07:45.506192 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:07:56 crc kubenswrapper[4744]: I0930 04:07:56.505117 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:07:56 crc kubenswrapper[4744]: E0930 04:07:56.506182 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:08:10 crc kubenswrapper[4744]: I0930 04:08:10.509329 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:08:10 crc kubenswrapper[4744]: E0930 04:08:10.514590 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:08:25 crc kubenswrapper[4744]: I0930 04:08:25.508176 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:08:25 crc kubenswrapper[4744]: E0930 04:08:25.509058 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:08:39 crc kubenswrapper[4744]: I0930 04:08:39.503801 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:08:39 crc kubenswrapper[4744]: E0930 04:08:39.506447 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:08:50 crc kubenswrapper[4744]: I0930 04:08:50.503602 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:08:50 crc kubenswrapper[4744]: E0930 04:08:50.504412 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:09:01 crc kubenswrapper[4744]: I0930 04:09:01.505586 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:09:01 crc kubenswrapper[4744]: E0930 04:09:01.506823 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:09:15 crc kubenswrapper[4744]: I0930 04:09:15.504709 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:09:15 crc kubenswrapper[4744]: E0930 04:09:15.506201 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:09:29 crc kubenswrapper[4744]: I0930 04:09:29.503981 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:09:29 crc kubenswrapper[4744]: E0930 04:09:29.504910 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:09:44 crc kubenswrapper[4744]: I0930 04:09:44.505104 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:09:44 crc kubenswrapper[4744]: E0930 04:09:44.505965 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:09:55 crc kubenswrapper[4744]: I0930 04:09:55.504568 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:09:55 crc kubenswrapper[4744]: E0930 04:09:55.505957 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:10:09 crc kubenswrapper[4744]: I0930 04:10:09.504778 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:10:09 crc kubenswrapper[4744]: E0930 04:10:09.505696 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:10:24 crc kubenswrapper[4744]: I0930 04:10:24.503846 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:10:24 crc kubenswrapper[4744]: E0930 04:10:24.504640 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"
Sep 30 04:10:36 crc kubenswrapper[4744]: I0930 04:10:36.503132 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee"
Sep 30 04:10:37 crc kubenswrapper[4744]: I0930 04:10:37.608587 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"84f93c804bb50a2966360c410d3376d06cb89fd915c6123ac6a23b01e8432d05"}
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.571522 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d8hrx"]
Sep 30 04:11:09 crc kubenswrapper[4744]: E0930 04:11:09.572651 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="extract-content"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.572672 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="extract-content"
Sep 30 04:11:09 crc kubenswrapper[4744]: E0930 04:11:09.572690 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="extract-utilities"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.572699 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="extract-utilities"
Sep 30 04:11:09 crc kubenswrapper[4744]: E0930 04:11:09.572745 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="registry-server"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.572754 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="registry-server"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.573017 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51d674b-a85a-4b59-a20a-b4e7107e8640" containerName="registry-server"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.574968 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.596754 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d8hrx"]
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.734711 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-catalog-content\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.734806 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrk9l\" (UniqueName: \"kubernetes.io/projected/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-kube-api-access-zrk9l\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.734889 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-utilities\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.836692 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-catalog-content\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.836786 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrk9l\" (UniqueName: \"kubernetes.io/projected/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-kube-api-access-zrk9l\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.836870 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-utilities\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.837509 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-utilities\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.837793 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-catalog-content\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.862253 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrk9l\" (UniqueName: \"kubernetes.io/projected/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-kube-api-access-zrk9l\") pod \"redhat-operators-d8hrx\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") " pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:09 crc kubenswrapper[4744]: I0930 04:11:09.968939 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:10 crc kubenswrapper[4744]: I0930 04:11:10.446262 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d8hrx"]
Sep 30 04:11:10 crc kubenswrapper[4744]: I0930 04:11:10.959990 4744 generic.go:334] "Generic (PLEG): container finished" podID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerID="c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75" exitCode=0
Sep 30 04:11:10 crc kubenswrapper[4744]: I0930 04:11:10.960070 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerDied","Data":"c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75"}
Sep 30 04:11:10 crc kubenswrapper[4744]: I0930 04:11:10.960291 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerStarted","Data":"1987e577cd60ce4ecc80a8bc020f94c91906bfab5d15325b3aaa09c3fca966ab"}
Sep 30 04:11:10 crc kubenswrapper[4744]: I0930 04:11:10.962264 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 04:11:12 crc kubenswrapper[4744]: I0930 04:11:12.986879 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerStarted","Data":"b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9"}
Sep 30 04:11:14 crc kubenswrapper[4744]: I0930 04:11:14.001288 4744 generic.go:334] "Generic (PLEG): container finished" podID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerID="b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9" exitCode=0
Sep 30 04:11:14 crc kubenswrapper[4744]: I0930 04:11:14.001431 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerDied","Data":"b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9"}
Sep 30 04:11:15 crc kubenswrapper[4744]: I0930 04:11:15.019023 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerStarted","Data":"e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d"}
Sep 30 04:11:15 crc kubenswrapper[4744]: I0930 04:11:15.060515 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d8hrx" podStartSLOduration=2.6025376099999997 podStartE2EDuration="6.060486535s" podCreationTimestamp="2025-09-30 04:11:09 +0000 UTC" firstStartedPulling="2025-09-30 04:11:10.962047097 +0000 UTC m=+4598.135267071" lastFinishedPulling="2025-09-30 04:11:14.419996022 +0000 UTC m=+4601.593215996" observedRunningTime="2025-09-30 04:11:15.048766989 +0000 UTC m=+4602.221986973" watchObservedRunningTime="2025-09-30 04:11:15.060486535 +0000 UTC m=+4602.233706559"
Sep 30 04:11:19 crc kubenswrapper[4744]: I0930 04:11:19.970171 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:19 crc kubenswrapper[4744]: I0930 04:11:19.970984 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:20 crc kubenswrapper[4744]: I0930 04:11:20.052305 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:20 crc kubenswrapper[4744]: I0930 04:11:20.137615 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:20 crc kubenswrapper[4744]: I0930 04:11:20.294651 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d8hrx"]
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.093891 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d8hrx" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="registry-server" containerID="cri-o://e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d" gracePeriod=2
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.639116 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.830639 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrk9l\" (UniqueName: \"kubernetes.io/projected/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-kube-api-access-zrk9l\") pod \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") "
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.830790 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-utilities\") pod \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") "
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.831089 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-catalog-content\") pod \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\" (UID: \"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4\") "
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.833119 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-utilities" (OuterVolumeSpecName: "utilities") pod "9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" (UID: "9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.853078 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-kube-api-access-zrk9l" (OuterVolumeSpecName: "kube-api-access-zrk9l") pod "9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" (UID: "9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4"). InnerVolumeSpecName "kube-api-access-zrk9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.934586 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 04:11:22 crc kubenswrapper[4744]: I0930 04:11:22.934650 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrk9l\" (UniqueName: \"kubernetes.io/projected/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-kube-api-access-zrk9l\") on node \"crc\" DevicePath \"\""
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.108479 4744 generic.go:334] "Generic (PLEG): container finished" podID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerID="e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d" exitCode=0
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.108599 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerDied","Data":"e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d"}
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.108654 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d8hrx" event={"ID":"9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4","Type":"ContainerDied","Data":"1987e577cd60ce4ecc80a8bc020f94c91906bfab5d15325b3aaa09c3fca966ab"}
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.108692 4744 scope.go:117] "RemoveContainer" containerID="e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.108951 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d8hrx"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.142673 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" (UID: "9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.156808 4744 scope.go:117] "RemoveContainer" containerID="b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.197142 4744 scope.go:117] "RemoveContainer" containerID="c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.241568 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.285082 4744 scope.go:117] "RemoveContainer" containerID="e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d"
Sep 30 04:11:23 crc kubenswrapper[4744]: E0930 04:11:23.285727 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d\": container with ID starting with e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d not found: ID does not exist" containerID="e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.285775 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d"} err="failed to get container status \"e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d\": rpc error: code = NotFound desc = could not find container \"e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d\": container with ID starting with e7cfeb1d71178a87d844f7faf1d7a3be2d0696500568b530690a4ef0a0431c3d not found: ID does not exist"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.285801 4744 scope.go:117] "RemoveContainer" containerID="b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9"
Sep 30 04:11:23 crc kubenswrapper[4744]: E0930 04:11:23.286519 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9\": container with ID starting with b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9 not found: ID does not exist" containerID="b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.286543 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9"} err="failed to get container status \"b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9\": rpc error: code = NotFound desc = could not find container \"b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9\": container with ID starting with b987ff446e41072cd1cc1f474a774f63adb70596a9409b0658df2667a8e0eca9 not found: ID does not exist"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.286557 4744 scope.go:117] "RemoveContainer" containerID="c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75"
Sep 30 04:11:23 crc kubenswrapper[4744]: E0930 04:11:23.286969 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75\": container with ID starting with c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75 not found: ID does not exist" containerID="c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.287012 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75"} err="failed to get container status \"c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75\": rpc error: code = NotFound desc = could not find container \"c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75\": container with ID starting with c007e797160b150d52e3a800e9379ef344caa6b795b12ef87cba3ba9b8338f75 not found: ID does not exist"
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.440358 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d8hrx"]
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.448437 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d8hrx"]
Sep 30 04:11:23 crc kubenswrapper[4744]: I0930 04:11:23.513942 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" path="/var/lib/kubelet/pods/9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4/volumes"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.861653 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vcbn7"]
Sep 30 04:11:44 crc kubenswrapper[4744]: E0930 04:11:44.862580 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="extract-content"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.862593 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="extract-content"
Sep 30 04:11:44 crc kubenswrapper[4744]: E0930 04:11:44.862638 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="extract-utilities"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.862645 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="extract-utilities"
Sep 30 04:11:44 crc kubenswrapper[4744]: E0930 04:11:44.862660 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="registry-server"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.862667 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="registry-server"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.862858 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf0489b-7404-43f9-8aa4-91fd5a2ac0b4" containerName="registry-server"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.864156 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:44 crc kubenswrapper[4744]: I0930 04:11:44.892301 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vcbn7"]
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.037930 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6kft\" (UniqueName: \"kubernetes.io/projected/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-kube-api-access-x6kft\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.038134 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-catalog-content\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.038316 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-utilities\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.140392 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-catalog-content\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.140486 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-utilities\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.140616 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6kft\" (UniqueName: \"kubernetes.io/projected/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-kube-api-access-x6kft\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.141044 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-catalog-content\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.141424 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-utilities\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.253554 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6kft\" (UniqueName: \"kubernetes.io/projected/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-kube-api-access-x6kft\") pod \"certified-operators-vcbn7\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:45 crc kubenswrapper[4744]: I0930 04:11:45.503045 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:46 crc kubenswrapper[4744]: I0930 04:11:46.099422 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vcbn7"]
Sep 30 04:11:46 crc kubenswrapper[4744]: I0930 04:11:46.372049 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerStarted","Data":"21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608"}
Sep 30 04:11:46 crc kubenswrapper[4744]: I0930 04:11:46.372097 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerStarted","Data":"1bbb158511bc219a0cc06d164ec651b027634cac69daafbe990bbfb79c8e54e3"}
Sep 30 04:11:47 crc kubenswrapper[4744]: I0930 04:11:47.386990 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerID="21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608" exitCode=0
Sep 30 04:11:47 crc kubenswrapper[4744]: I0930 04:11:47.387109 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerDied","Data":"21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608"}
Sep 30 04:11:48 crc kubenswrapper[4744]: I0930 04:11:48.403630 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerStarted","Data":"502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3"}
Sep 30 04:11:49 crc kubenswrapper[4744]: I0930 04:11:49.416579 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerID="502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3" exitCode=0
Sep 30 04:11:49 crc kubenswrapper[4744]: I0930 04:11:49.416649 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerDied","Data":"502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3"}
Sep 30 04:11:50 crc kubenswrapper[4744]: I0930 04:11:50.434800 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerStarted","Data":"985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77"}
Sep 30 04:11:50 crc kubenswrapper[4744]: I0930 04:11:50.472708 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vcbn7" podStartSLOduration=3.687156937 podStartE2EDuration="6.472688271s" podCreationTimestamp="2025-09-30 04:11:44 +0000 UTC" firstStartedPulling="2025-09-30 04:11:47.389753142 +0000 UTC m=+4634.562973126" lastFinishedPulling="2025-09-30 04:11:50.175284476 +0000 UTC m=+4637.348504460" observedRunningTime="2025-09-30 04:11:50.459407127 +0000 UTC m=+4637.632627111" watchObservedRunningTime="2025-09-30 04:11:50.472688271 +0000 UTC m=+4637.645908255"
Sep 30 04:11:55 crc kubenswrapper[4744]: I0930 04:11:55.527523 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:55 crc kubenswrapper[4744]: I0930 04:11:55.528447 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:55 crc kubenswrapper[4744]: I0930 04:11:55.574835 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vcbn7"
Sep 30 04:11:56 crc kubenswrapper[4744]: I0930 
04:11:56.554202 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vcbn7" Sep 30 04:11:56 crc kubenswrapper[4744]: I0930 04:11:56.616445 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vcbn7"] Sep 30 04:11:58 crc kubenswrapper[4744]: I0930 04:11:58.528217 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vcbn7" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="registry-server" containerID="cri-o://985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77" gracePeriod=2 Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.050485 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vcbn7" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.208550 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-catalog-content\") pod \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.208751 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-utilities\") pod \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.208899 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6kft\" (UniqueName: \"kubernetes.io/projected/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-kube-api-access-x6kft\") pod \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\" (UID: \"bc0bfdf3-744d-4737-b414-d5e49ef29ec5\") " Sep 30 04:11:59 crc kubenswrapper[4744]: 
I0930 04:11:59.209937 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-utilities" (OuterVolumeSpecName: "utilities") pod "bc0bfdf3-744d-4737-b414-d5e49ef29ec5" (UID: "bc0bfdf3-744d-4737-b414-d5e49ef29ec5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.220440 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-kube-api-access-x6kft" (OuterVolumeSpecName: "kube-api-access-x6kft") pod "bc0bfdf3-744d-4737-b414-d5e49ef29ec5" (UID: "bc0bfdf3-744d-4737-b414-d5e49ef29ec5"). InnerVolumeSpecName "kube-api-access-x6kft". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.311314 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.311585 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6kft\" (UniqueName: \"kubernetes.io/projected/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-kube-api-access-x6kft\") on node \"crc\" DevicePath \"\"" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.366160 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc0bfdf3-744d-4737-b414-d5e49ef29ec5" (UID: "bc0bfdf3-744d-4737-b414-d5e49ef29ec5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.412492 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0bfdf3-744d-4737-b414-d5e49ef29ec5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.544814 4744 generic.go:334] "Generic (PLEG): container finished" podID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerID="985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77" exitCode=0 Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.544867 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerDied","Data":"985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77"} Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.544917 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vcbn7" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.544940 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vcbn7" event={"ID":"bc0bfdf3-744d-4737-b414-d5e49ef29ec5","Type":"ContainerDied","Data":"1bbb158511bc219a0cc06d164ec651b027634cac69daafbe990bbfb79c8e54e3"} Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.544961 4744 scope.go:117] "RemoveContainer" containerID="985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.580421 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vcbn7"] Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.588196 4744 scope.go:117] "RemoveContainer" containerID="502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.589556 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vcbn7"] Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.628381 4744 scope.go:117] "RemoveContainer" containerID="21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.666758 4744 scope.go:117] "RemoveContainer" containerID="985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77" Sep 30 04:11:59 crc kubenswrapper[4744]: E0930 04:11:59.667345 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77\": container with ID starting with 985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77 not found: ID does not exist" containerID="985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.667420 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77"} err="failed to get container status \"985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77\": rpc error: code = NotFound desc = could not find container \"985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77\": container with ID starting with 985e32f1ce817299655c04b3f53b9897848a1022b78207a05ebe2259785b2b77 not found: ID does not exist" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.667451 4744 scope.go:117] "RemoveContainer" containerID="502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3" Sep 30 04:11:59 crc kubenswrapper[4744]: E0930 04:11:59.667918 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3\": container with ID starting with 502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3 not found: ID does not exist" containerID="502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.668101 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3"} err="failed to get container status \"502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3\": rpc error: code = NotFound desc = could not find container \"502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3\": container with ID starting with 502f269dc903747a5ceb8b56c51e2ad0091e26ac30cd37d7a8c87269cb40c4e3 not found: ID does not exist" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.668316 4744 scope.go:117] "RemoveContainer" containerID="21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608" Sep 30 04:11:59 crc kubenswrapper[4744]: E0930 
04:11:59.668905 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608\": container with ID starting with 21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608 not found: ID does not exist" containerID="21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608" Sep 30 04:11:59 crc kubenswrapper[4744]: I0930 04:11:59.668937 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608"} err="failed to get container status \"21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608\": rpc error: code = NotFound desc = could not find container \"21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608\": container with ID starting with 21d2ddc24b811f33ede755a467c6c4a478cf9d9a7b48386b694e10e7ba84e608 not found: ID does not exist" Sep 30 04:12:01 crc kubenswrapper[4744]: I0930 04:12:01.523077 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" path="/var/lib/kubelet/pods/bc0bfdf3-744d-4737-b414-d5e49ef29ec5/volumes" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.075592 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ggvnj"] Sep 30 04:13:04 crc kubenswrapper[4744]: E0930 04:13:04.076882 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="extract-content" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.076903 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="extract-content" Sep 30 04:13:04 crc kubenswrapper[4744]: E0930 04:13:04.076923 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="registry-server" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.076932 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="registry-server" Sep 30 04:13:04 crc kubenswrapper[4744]: E0930 04:13:04.076960 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="extract-utilities" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.076970 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="extract-utilities" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.077230 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0bfdf3-744d-4737-b414-d5e49ef29ec5" containerName="registry-server" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.079119 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.106294 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggvnj"] Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.136764 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nzwm\" (UniqueName: \"kubernetes.io/projected/beed1d71-7272-418c-aad4-a0c2bc779c7d-kube-api-access-5nzwm\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.136895 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-catalog-content\") pod \"community-operators-ggvnj\" (UID: 
\"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.136940 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-utilities\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.240027 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nzwm\" (UniqueName: \"kubernetes.io/projected/beed1d71-7272-418c-aad4-a0c2bc779c7d-kube-api-access-5nzwm\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.240383 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-catalog-content\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.240407 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-utilities\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.240822 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-utilities\") pod \"community-operators-ggvnj\" (UID: 
\"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.241420 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-catalog-content\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.268278 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nzwm\" (UniqueName: \"kubernetes.io/projected/beed1d71-7272-418c-aad4-a0c2bc779c7d-kube-api-access-5nzwm\") pod \"community-operators-ggvnj\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.348349 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.348448 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.416969 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:04 crc kubenswrapper[4744]: I0930 04:13:04.996596 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggvnj"] Sep 30 04:13:05 crc kubenswrapper[4744]: I0930 04:13:05.333590 4744 generic.go:334] "Generic (PLEG): container finished" podID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerID="78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230" exitCode=0 Sep 30 04:13:05 crc kubenswrapper[4744]: I0930 04:13:05.333710 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerDied","Data":"78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230"} Sep 30 04:13:05 crc kubenswrapper[4744]: I0930 04:13:05.334003 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerStarted","Data":"368913ccd857c197a4923c314f2a1d5fcf57a748d324143f6865e1fab58cb508"} Sep 30 04:13:06 crc kubenswrapper[4744]: I0930 04:13:06.342403 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerStarted","Data":"26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1"} Sep 30 04:13:07 crc kubenswrapper[4744]: I0930 04:13:07.356695 4744 generic.go:334] "Generic (PLEG): container finished" podID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerID="26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1" exitCode=0 Sep 30 04:13:07 crc kubenswrapper[4744]: I0930 04:13:07.357570 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" 
event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerDied","Data":"26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1"} Sep 30 04:13:08 crc kubenswrapper[4744]: I0930 04:13:08.373897 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerStarted","Data":"b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7"} Sep 30 04:13:08 crc kubenswrapper[4744]: I0930 04:13:08.406073 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ggvnj" podStartSLOduration=1.937416506 podStartE2EDuration="4.406048928s" podCreationTimestamp="2025-09-30 04:13:04 +0000 UTC" firstStartedPulling="2025-09-30 04:13:05.335708652 +0000 UTC m=+4712.508928626" lastFinishedPulling="2025-09-30 04:13:07.804341064 +0000 UTC m=+4714.977561048" observedRunningTime="2025-09-30 04:13:08.401805795 +0000 UTC m=+4715.575025779" watchObservedRunningTime="2025-09-30 04:13:08.406048928 +0000 UTC m=+4715.579268912" Sep 30 04:13:14 crc kubenswrapper[4744]: I0930 04:13:14.417966 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:14 crc kubenswrapper[4744]: I0930 04:13:14.418589 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:14 crc kubenswrapper[4744]: I0930 04:13:14.510730 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:14 crc kubenswrapper[4744]: I0930 04:13:14.586186 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:14 crc kubenswrapper[4744]: I0930 04:13:14.762460 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ggvnj"] Sep 30 04:13:16 crc kubenswrapper[4744]: I0930 04:13:16.467145 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ggvnj" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="registry-server" containerID="cri-o://b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7" gracePeriod=2 Sep 30 04:13:16 crc kubenswrapper[4744]: I0930 04:13:16.960491 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.126126 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-utilities\") pod \"beed1d71-7272-418c-aad4-a0c2bc779c7d\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.126340 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-catalog-content\") pod \"beed1d71-7272-418c-aad4-a0c2bc779c7d\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.126541 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nzwm\" (UniqueName: \"kubernetes.io/projected/beed1d71-7272-418c-aad4-a0c2bc779c7d-kube-api-access-5nzwm\") pod \"beed1d71-7272-418c-aad4-a0c2bc779c7d\" (UID: \"beed1d71-7272-418c-aad4-a0c2bc779c7d\") " Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.127913 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-utilities" (OuterVolumeSpecName: "utilities") pod "beed1d71-7272-418c-aad4-a0c2bc779c7d" (UID: 
"beed1d71-7272-418c-aad4-a0c2bc779c7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.128621 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.141777 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beed1d71-7272-418c-aad4-a0c2bc779c7d-kube-api-access-5nzwm" (OuterVolumeSpecName: "kube-api-access-5nzwm") pod "beed1d71-7272-418c-aad4-a0c2bc779c7d" (UID: "beed1d71-7272-418c-aad4-a0c2bc779c7d"). InnerVolumeSpecName "kube-api-access-5nzwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.230949 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nzwm\" (UniqueName: \"kubernetes.io/projected/beed1d71-7272-418c-aad4-a0c2bc779c7d-kube-api-access-5nzwm\") on node \"crc\" DevicePath \"\"" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.487746 4744 generic.go:334] "Generic (PLEG): container finished" podID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerID="b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7" exitCode=0 Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.487812 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerDied","Data":"b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7"} Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.487854 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggvnj" 
event={"ID":"beed1d71-7272-418c-aad4-a0c2bc779c7d","Type":"ContainerDied","Data":"368913ccd857c197a4923c314f2a1d5fcf57a748d324143f6865e1fab58cb508"} Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.487887 4744 scope.go:117] "RemoveContainer" containerID="b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.488077 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggvnj" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.514302 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "beed1d71-7272-418c-aad4-a0c2bc779c7d" (UID: "beed1d71-7272-418c-aad4-a0c2bc779c7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.534148 4744 scope.go:117] "RemoveContainer" containerID="26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.540083 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beed1d71-7272-418c-aad4-a0c2bc779c7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.565148 4744 scope.go:117] "RemoveContainer" containerID="78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.606909 4744 scope.go:117] "RemoveContainer" containerID="b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7" Sep 30 04:13:17 crc kubenswrapper[4744]: E0930 04:13:17.607457 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7\": container with ID starting with b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7 not found: ID does not exist" containerID="b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.607514 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7"} err="failed to get container status \"b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7\": rpc error: code = NotFound desc = could not find container \"b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7\": container with ID starting with b2e41fd4698d932696ef55e1b5f344617e2b9aeff5b7da9bdd760a98f94094b7 not found: ID does not exist" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.607539 4744 scope.go:117] "RemoveContainer" containerID="26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1" Sep 30 04:13:17 crc kubenswrapper[4744]: E0930 04:13:17.608087 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1\": container with ID starting with 26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1 not found: ID does not exist" containerID="26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.608131 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1"} err="failed to get container status \"26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1\": rpc error: code = NotFound desc = could not find container \"26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1\": container with ID 
starting with 26e511a02c5ad97c26293dabe0dbe9423f13407cda6b28daf1c739777c1f94e1 not found: ID does not exist" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.608161 4744 scope.go:117] "RemoveContainer" containerID="78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230" Sep 30 04:13:17 crc kubenswrapper[4744]: E0930 04:13:17.608612 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230\": container with ID starting with 78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230 not found: ID does not exist" containerID="78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.608642 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230"} err="failed to get container status \"78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230\": rpc error: code = NotFound desc = could not find container \"78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230\": container with ID starting with 78887750cd7162d9f5ef02cd5a27b134e045aa10e870411acd38128a7cbe4230 not found: ID does not exist" Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.818447 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggvnj"] Sep 30 04:13:17 crc kubenswrapper[4744]: I0930 04:13:17.832588 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ggvnj"] Sep 30 04:13:19 crc kubenswrapper[4744]: I0930 04:13:19.525048 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" path="/var/lib/kubelet/pods/beed1d71-7272-418c-aad4-a0c2bc779c7d/volumes" Sep 30 04:13:34 crc kubenswrapper[4744]: I0930 
04:13:34.347982 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:13:34 crc kubenswrapper[4744]: I0930 04:13:34.349792 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.181856 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6shvh"] Sep 30 04:13:37 crc kubenswrapper[4744]: E0930 04:13:37.182882 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="registry-server" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.182904 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="registry-server" Sep 30 04:13:37 crc kubenswrapper[4744]: E0930 04:13:37.182937 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="extract-utilities" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.182951 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="extract-utilities" Sep 30 04:13:37 crc kubenswrapper[4744]: E0930 04:13:37.182992 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="extract-content" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.183006 4744 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="extract-content" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.183501 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="beed1d71-7272-418c-aad4-a0c2bc779c7d" containerName="registry-server" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.186086 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.199728 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6shvh"] Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.370123 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-catalog-content\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.370193 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66l9\" (UniqueName: \"kubernetes.io/projected/b7890aa7-1498-4104-9f85-51948708a595-kube-api-access-t66l9\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.370321 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-utilities\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.472218 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-utilities\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.472401 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-catalog-content\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.472453 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66l9\" (UniqueName: \"kubernetes.io/projected/b7890aa7-1498-4104-9f85-51948708a595-kube-api-access-t66l9\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.473017 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-catalog-content\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.473014 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-utilities\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.507252 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t66l9\" (UniqueName: \"kubernetes.io/projected/b7890aa7-1498-4104-9f85-51948708a595-kube-api-access-t66l9\") pod \"redhat-marketplace-6shvh\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:37 crc kubenswrapper[4744]: I0930 04:13:37.539685 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:38 crc kubenswrapper[4744]: I0930 04:13:38.003799 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6shvh"] Sep 30 04:13:38 crc kubenswrapper[4744]: I0930 04:13:38.752180 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7890aa7-1498-4104-9f85-51948708a595" containerID="d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4" exitCode=0 Sep 30 04:13:38 crc kubenswrapper[4744]: I0930 04:13:38.752336 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerDied","Data":"d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4"} Sep 30 04:13:38 crc kubenswrapper[4744]: I0930 04:13:38.752566 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerStarted","Data":"e405dbdea2c31e980efeed1862b77c93e49de6e38c7dc8a63a0930621f8da705"} Sep 30 04:13:39 crc kubenswrapper[4744]: I0930 04:13:39.766057 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerStarted","Data":"2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2"} Sep 30 04:13:40 crc kubenswrapper[4744]: I0930 04:13:40.783031 4744 generic.go:334] "Generic (PLEG): container finished" 
podID="b7890aa7-1498-4104-9f85-51948708a595" containerID="2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2" exitCode=0 Sep 30 04:13:40 crc kubenswrapper[4744]: I0930 04:13:40.783088 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerDied","Data":"2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2"} Sep 30 04:13:41 crc kubenswrapper[4744]: I0930 04:13:41.796622 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerStarted","Data":"787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2"} Sep 30 04:13:41 crc kubenswrapper[4744]: I0930 04:13:41.812063 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6shvh" podStartSLOduration=2.263450426 podStartE2EDuration="4.812045752s" podCreationTimestamp="2025-09-30 04:13:37 +0000 UTC" firstStartedPulling="2025-09-30 04:13:38.754406421 +0000 UTC m=+4745.927626435" lastFinishedPulling="2025-09-30 04:13:41.303001747 +0000 UTC m=+4748.476221761" observedRunningTime="2025-09-30 04:13:41.81198035 +0000 UTC m=+4748.985200394" watchObservedRunningTime="2025-09-30 04:13:41.812045752 +0000 UTC m=+4748.985265726" Sep 30 04:13:47 crc kubenswrapper[4744]: I0930 04:13:47.539816 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:47 crc kubenswrapper[4744]: I0930 04:13:47.540357 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:47 crc kubenswrapper[4744]: I0930 04:13:47.595624 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:47 crc 
kubenswrapper[4744]: I0930 04:13:47.935907 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:47 crc kubenswrapper[4744]: I0930 04:13:47.983519 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6shvh"] Sep 30 04:13:49 crc kubenswrapper[4744]: I0930 04:13:49.886819 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6shvh" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="registry-server" containerID="cri-o://787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2" gracePeriod=2 Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.448852 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.555355 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-catalog-content\") pod \"b7890aa7-1498-4104-9f85-51948708a595\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.555610 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-utilities\") pod \"b7890aa7-1498-4104-9f85-51948708a595\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.555794 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66l9\" (UniqueName: \"kubernetes.io/projected/b7890aa7-1498-4104-9f85-51948708a595-kube-api-access-t66l9\") pod \"b7890aa7-1498-4104-9f85-51948708a595\" (UID: \"b7890aa7-1498-4104-9f85-51948708a595\") " Sep 30 04:13:50 crc 
kubenswrapper[4744]: I0930 04:13:50.556477 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-utilities" (OuterVolumeSpecName: "utilities") pod "b7890aa7-1498-4104-9f85-51948708a595" (UID: "b7890aa7-1498-4104-9f85-51948708a595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.562118 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7890aa7-1498-4104-9f85-51948708a595-kube-api-access-t66l9" (OuterVolumeSpecName: "kube-api-access-t66l9") pod "b7890aa7-1498-4104-9f85-51948708a595" (UID: "b7890aa7-1498-4104-9f85-51948708a595"). InnerVolumeSpecName "kube-api-access-t66l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.572870 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7890aa7-1498-4104-9f85-51948708a595" (UID: "b7890aa7-1498-4104-9f85-51948708a595"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.658319 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66l9\" (UniqueName: \"kubernetes.io/projected/b7890aa7-1498-4104-9f85-51948708a595-kube-api-access-t66l9\") on node \"crc\" DevicePath \"\"" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.658700 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.658722 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7890aa7-1498-4104-9f85-51948708a595-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.902189 4744 generic.go:334] "Generic (PLEG): container finished" podID="b7890aa7-1498-4104-9f85-51948708a595" containerID="787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2" exitCode=0 Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.902260 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerDied","Data":"787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2"} Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.902305 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6shvh" event={"ID":"b7890aa7-1498-4104-9f85-51948708a595","Type":"ContainerDied","Data":"e405dbdea2c31e980efeed1862b77c93e49de6e38c7dc8a63a0930621f8da705"} Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.902333 4744 scope.go:117] "RemoveContainer" containerID="787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 
04:13:50.902580 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6shvh" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.944715 4744 scope.go:117] "RemoveContainer" containerID="2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2" Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.966238 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6shvh"] Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.979043 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6shvh"] Sep 30 04:13:50 crc kubenswrapper[4744]: I0930 04:13:50.996151 4744 scope.go:117] "RemoveContainer" containerID="d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4" Sep 30 04:13:51 crc kubenswrapper[4744]: I0930 04:13:51.062335 4744 scope.go:117] "RemoveContainer" containerID="787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2" Sep 30 04:13:51 crc kubenswrapper[4744]: E0930 04:13:51.062857 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2\": container with ID starting with 787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2 not found: ID does not exist" containerID="787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2" Sep 30 04:13:51 crc kubenswrapper[4744]: I0930 04:13:51.062908 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2"} err="failed to get container status \"787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2\": rpc error: code = NotFound desc = could not find container \"787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2\": container with ID starting with 
787eeaa685e3df065fc5a8810cbb6539d20ab241dbe0cbc2c6aa448a16d78ef2 not found: ID does not exist" Sep 30 04:13:51 crc kubenswrapper[4744]: I0930 04:13:51.062943 4744 scope.go:117] "RemoveContainer" containerID="2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2" Sep 30 04:13:51 crc kubenswrapper[4744]: E0930 04:13:51.063321 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2\": container with ID starting with 2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2 not found: ID does not exist" containerID="2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2" Sep 30 04:13:51 crc kubenswrapper[4744]: I0930 04:13:51.063388 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2"} err="failed to get container status \"2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2\": rpc error: code = NotFound desc = could not find container \"2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2\": container with ID starting with 2f872b23bd65b1038a1eac56034bfb236dd1f82a556b859583735001b746a5f2 not found: ID does not exist" Sep 30 04:13:51 crc kubenswrapper[4744]: I0930 04:13:51.063423 4744 scope.go:117] "RemoveContainer" containerID="d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4" Sep 30 04:13:51 crc kubenswrapper[4744]: E0930 04:13:51.063766 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4\": container with ID starting with d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4 not found: ID does not exist" containerID="d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4" Sep 30 04:13:51 crc 
kubenswrapper[4744]: I0930 04:13:51.063808 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4"} err="failed to get container status \"d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4\": rpc error: code = NotFound desc = could not find container \"d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4\": container with ID starting with d157d07a6200e13683975a891c7973f87dce60ac1e8389c7ae0b5517631ca5a4 not found: ID does not exist" Sep 30 04:13:51 crc kubenswrapper[4744]: I0930 04:13:51.524189 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7890aa7-1498-4104-9f85-51948708a595" path="/var/lib/kubelet/pods/b7890aa7-1498-4104-9f85-51948708a595/volumes" Sep 30 04:14:04 crc kubenswrapper[4744]: I0930 04:14:04.348815 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:14:04 crc kubenswrapper[4744]: I0930 04:14:04.350011 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:14:04 crc kubenswrapper[4744]: I0930 04:14:04.350094 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 04:14:04 crc kubenswrapper[4744]: I0930 04:14:04.352692 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"84f93c804bb50a2966360c410d3376d06cb89fd915c6123ac6a23b01e8432d05"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 04:14:04 crc kubenswrapper[4744]: I0930 04:14:04.352802 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://84f93c804bb50a2966360c410d3376d06cb89fd915c6123ac6a23b01e8432d05" gracePeriod=600 Sep 30 04:14:05 crc kubenswrapper[4744]: I0930 04:14:05.104729 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="84f93c804bb50a2966360c410d3376d06cb89fd915c6123ac6a23b01e8432d05" exitCode=0 Sep 30 04:14:05 crc kubenswrapper[4744]: I0930 04:14:05.104788 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"84f93c804bb50a2966360c410d3376d06cb89fd915c6123ac6a23b01e8432d05"} Sep 30 04:14:05 crc kubenswrapper[4744]: I0930 04:14:05.105107 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1"} Sep 30 04:14:05 crc kubenswrapper[4744]: I0930 04:14:05.105137 4744 scope.go:117] "RemoveContainer" containerID="cc312f5c4ae18356903768ba50bcc086436eeea33b598452c8500630f1e6c4ee" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.168854 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v"] Sep 30 04:15:00 crc kubenswrapper[4744]: 
E0930 04:15:00.170022 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="extract-utilities" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.170041 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="extract-utilities" Sep 30 04:15:00 crc kubenswrapper[4744]: E0930 04:15:00.170061 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="extract-content" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.170069 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="extract-content" Sep 30 04:15:00 crc kubenswrapper[4744]: E0930 04:15:00.170117 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="registry-server" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.170124 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="registry-server" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.170341 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7890aa7-1498-4104-9f85-51948708a595" containerName="registry-server" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.171314 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.175492 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.175711 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.181838 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v"] Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.275483 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822e65c8-bc12-42e3-94fc-83df0419ae4f-config-volume\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.275773 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822e65c8-bc12-42e3-94fc-83df0419ae4f-secret-volume\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.275861 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs7lt\" (UniqueName: \"kubernetes.io/projected/822e65c8-bc12-42e3-94fc-83df0419ae4f-kube-api-access-vs7lt\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.378079 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822e65c8-bc12-42e3-94fc-83df0419ae4f-config-volume\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.378194 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822e65c8-bc12-42e3-94fc-83df0419ae4f-secret-volume\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.378225 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs7lt\" (UniqueName: \"kubernetes.io/projected/822e65c8-bc12-42e3-94fc-83df0419ae4f-kube-api-access-vs7lt\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.379989 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822e65c8-bc12-42e3-94fc-83df0419ae4f-config-volume\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.389808 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/822e65c8-bc12-42e3-94fc-83df0419ae4f-secret-volume\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.410768 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs7lt\" (UniqueName: \"kubernetes.io/projected/822e65c8-bc12-42e3-94fc-83df0419ae4f-kube-api-access-vs7lt\") pod \"collect-profiles-29320095-5nh9v\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:00 crc kubenswrapper[4744]: I0930 04:15:00.500460 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:01 crc kubenswrapper[4744]: I0930 04:15:01.003751 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v"] Sep 30 04:15:01 crc kubenswrapper[4744]: I0930 04:15:01.733159 4744 generic.go:334] "Generic (PLEG): container finished" podID="822e65c8-bc12-42e3-94fc-83df0419ae4f" containerID="963c49204860adbf50a22c140eb345bb44f8e20435d4e513cd40e9f53e397a85" exitCode=0 Sep 30 04:15:01 crc kubenswrapper[4744]: I0930 04:15:01.733199 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" event={"ID":"822e65c8-bc12-42e3-94fc-83df0419ae4f","Type":"ContainerDied","Data":"963c49204860adbf50a22c140eb345bb44f8e20435d4e513cd40e9f53e397a85"} Sep 30 04:15:01 crc kubenswrapper[4744]: I0930 04:15:01.733419 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" 
event={"ID":"822e65c8-bc12-42e3-94fc-83df0419ae4f","Type":"ContainerStarted","Data":"6436e819927528dc93f58c52e666e0d7b3c45e029cf21ae822bc237de663d4f8"} Sep 30 04:15:03 crc kubenswrapper[4744]: I0930 04:15:03.979481 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.059845 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822e65c8-bc12-42e3-94fc-83df0419ae4f-secret-volume\") pod \"822e65c8-bc12-42e3-94fc-83df0419ae4f\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.060055 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822e65c8-bc12-42e3-94fc-83df0419ae4f-config-volume\") pod \"822e65c8-bc12-42e3-94fc-83df0419ae4f\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.060137 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs7lt\" (UniqueName: \"kubernetes.io/projected/822e65c8-bc12-42e3-94fc-83df0419ae4f-kube-api-access-vs7lt\") pod \"822e65c8-bc12-42e3-94fc-83df0419ae4f\" (UID: \"822e65c8-bc12-42e3-94fc-83df0419ae4f\") " Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.061916 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822e65c8-bc12-42e3-94fc-83df0419ae4f-config-volume" (OuterVolumeSpecName: "config-volume") pod "822e65c8-bc12-42e3-94fc-83df0419ae4f" (UID: "822e65c8-bc12-42e3-94fc-83df0419ae4f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.067380 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822e65c8-bc12-42e3-94fc-83df0419ae4f-kube-api-access-vs7lt" (OuterVolumeSpecName: "kube-api-access-vs7lt") pod "822e65c8-bc12-42e3-94fc-83df0419ae4f" (UID: "822e65c8-bc12-42e3-94fc-83df0419ae4f"). InnerVolumeSpecName "kube-api-access-vs7lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.072540 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822e65c8-bc12-42e3-94fc-83df0419ae4f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "822e65c8-bc12-42e3-94fc-83df0419ae4f" (UID: "822e65c8-bc12-42e3-94fc-83df0419ae4f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.170482 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/822e65c8-bc12-42e3-94fc-83df0419ae4f-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.170537 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs7lt\" (UniqueName: \"kubernetes.io/projected/822e65c8-bc12-42e3-94fc-83df0419ae4f-kube-api-access-vs7lt\") on node \"crc\" DevicePath \"\"" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.170552 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/822e65c8-bc12-42e3-94fc-83df0419ae4f-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.769583 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" 
event={"ID":"822e65c8-bc12-42e3-94fc-83df0419ae4f","Type":"ContainerDied","Data":"6436e819927528dc93f58c52e666e0d7b3c45e029cf21ae822bc237de663d4f8"} Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.769946 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6436e819927528dc93f58c52e666e0d7b3c45e029cf21ae822bc237de663d4f8" Sep 30 04:15:04 crc kubenswrapper[4744]: I0930 04:15:04.769672 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320095-5nh9v" Sep 30 04:15:05 crc kubenswrapper[4744]: I0930 04:15:05.099480 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"] Sep 30 04:15:05 crc kubenswrapper[4744]: I0930 04:15:05.109774 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320050-4d6nl"] Sep 30 04:15:05 crc kubenswrapper[4744]: I0930 04:15:05.528346 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30471e6c-5068-46d1-a5b7-48d29949d489" path="/var/lib/kubelet/pods/30471e6c-5068-46d1-a5b7-48d29949d489/volumes" Sep 30 04:15:43 crc kubenswrapper[4744]: I0930 04:15:43.133076 4744 scope.go:117] "RemoveContainer" containerID="690c653dad5d3629c41f033b96a26b8e3bb1386d9c9b300428f4069f0b102a5a" Sep 30 04:16:04 crc kubenswrapper[4744]: I0930 04:16:04.347932 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:16:04 crc kubenswrapper[4744]: I0930 04:16:04.348638 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:16:25 crc kubenswrapper[4744]: I0930 04:16:25.687594 4744 generic.go:334] "Generic (PLEG): container finished" podID="f4a78f7a-b5bc-4636-81df-578f5105bce3" containerID="9d6e78a3adde425ff4cc4f71705b23a45577f722325d56f667bde6d003694c72" exitCode=0 Sep 30 04:16:25 crc kubenswrapper[4744]: I0930 04:16:25.687698 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f4a78f7a-b5bc-4636-81df-578f5105bce3","Type":"ContainerDied","Data":"9d6e78a3adde425ff4cc4f71705b23a45577f722325d56f667bde6d003694c72"} Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.119590 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.169651 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.169745 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4j74\" (UniqueName: \"kubernetes.io/projected/f4a78f7a-b5bc-4636-81df-578f5105bce3-kube-api-access-q4j74\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.169794 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-config-data\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc 
kubenswrapper[4744]: I0930 04:16:27.169896 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.169953 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-workdir\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.170234 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config-secret\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.170316 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ssh-key\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.170348 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ca-certs\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.170464 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-temporary\") pod \"f4a78f7a-b5bc-4636-81df-578f5105bce3\" (UID: \"f4a78f7a-b5bc-4636-81df-578f5105bce3\") " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.180779 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-config-data" (OuterVolumeSpecName: "config-data") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.180889 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.180964 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.192815 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.194814 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a78f7a-b5bc-4636-81df-578f5105bce3-kube-api-access-q4j74" (OuterVolumeSpecName: "kube-api-access-q4j74") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "kube-api-access-q4j74". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.214861 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.228614 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.247566 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.254037 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f4a78f7a-b5bc-4636-81df-578f5105bce3" (UID: "f4a78f7a-b5bc-4636-81df-578f5105bce3"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271785 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271820 4744 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271830 4744 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f4a78f7a-b5bc-4636-81df-578f5105bce3-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271839 4744 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271850 4744 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271859 4744 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-q4j74\" (UniqueName: \"kubernetes.io/projected/f4a78f7a-b5bc-4636-81df-578f5105bce3-kube-api-access-q4j74\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271869 4744 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4a78f7a-b5bc-4636-81df-578f5105bce3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271903 4744 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.271915 4744 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f4a78f7a-b5bc-4636-81df-578f5105bce3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.305344 4744 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.374455 4744 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.717604 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f4a78f7a-b5bc-4636-81df-578f5105bce3","Type":"ContainerDied","Data":"c04e72a9d85adccd91c461b58cf2314b2ee6e7522adf0aa0f869e42eaa1d2f28"} Sep 30 04:16:27 crc kubenswrapper[4744]: I0930 04:16:27.717664 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04e72a9d85adccd91c461b58cf2314b2ee6e7522adf0aa0f869e42eaa1d2f28" Sep 30 04:16:27 crc 
kubenswrapper[4744]: I0930 04:16:27.717750 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.289955 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 04:16:31 crc kubenswrapper[4744]: E0930 04:16:31.291219 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822e65c8-bc12-42e3-94fc-83df0419ae4f" containerName="collect-profiles" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.291241 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="822e65c8-bc12-42e3-94fc-83df0419ae4f" containerName="collect-profiles" Sep 30 04:16:31 crc kubenswrapper[4744]: E0930 04:16:31.291279 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a78f7a-b5bc-4636-81df-578f5105bce3" containerName="tempest-tests-tempest-tests-runner" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.291292 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a78f7a-b5bc-4636-81df-578f5105bce3" containerName="tempest-tests-tempest-tests-runner" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.291663 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a78f7a-b5bc-4636-81df-578f5105bce3" containerName="tempest-tests-tempest-tests-runner" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.291686 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="822e65c8-bc12-42e3-94fc-83df0419ae4f" containerName="collect-profiles" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.292853 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.315103 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.386130 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmhv\" (UniqueName: \"kubernetes.io/projected/6fff3c05-b002-4910-8909-665295c5d940-kube-api-access-rkmhv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.386200 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.489297 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmhv\" (UniqueName: \"kubernetes.io/projected/6fff3c05-b002-4910-8909-665295c5d940-kube-api-access-rkmhv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.489424 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.489876 4744 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.528612 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmhv\" (UniqueName: \"kubernetes.io/projected/6fff3c05-b002-4910-8909-665295c5d940-kube-api-access-rkmhv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.532590 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6fff3c05-b002-4910-8909-665295c5d940\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.621823 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.954456 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 04:16:31 crc kubenswrapper[4744]: I0930 04:16:31.970991 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 04:16:32 crc kubenswrapper[4744]: I0930 04:16:32.777735 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6fff3c05-b002-4910-8909-665295c5d940","Type":"ContainerStarted","Data":"faa911351eb4ade18ba7fb47c0fe46e2ebfd4e0ca35fd2d4af019b25f039e64e"} Sep 30 04:16:34 crc kubenswrapper[4744]: I0930 04:16:34.348244 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:16:34 crc kubenswrapper[4744]: I0930 04:16:34.348655 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:16:34 crc kubenswrapper[4744]: I0930 04:16:34.800741 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6fff3c05-b002-4910-8909-665295c5d940","Type":"ContainerStarted","Data":"47e4a54a175a5c30d5f6c2a1b7688403b57d060f1bf5ee20772900878852663f"} Sep 30 04:16:34 crc kubenswrapper[4744]: I0930 04:16:34.842171 4744 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.936764218 podStartE2EDuration="3.842144536s" podCreationTimestamp="2025-09-30 04:16:31 +0000 UTC" firstStartedPulling="2025-09-30 04:16:31.97064004 +0000 UTC m=+4919.143860054" lastFinishedPulling="2025-09-30 04:16:33.876020358 +0000 UTC m=+4921.049240372" observedRunningTime="2025-09-30 04:16:34.839884865 +0000 UTC m=+4922.013104899" watchObservedRunningTime="2025-09-30 04:16:34.842144536 +0000 UTC m=+4922.015364540" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.717378 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7hzl/must-gather-gpgls"] Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.719605 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.721880 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l7hzl"/"openshift-service-ca.crt" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.722158 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-l7hzl"/"default-dockercfg-4q2xb" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.722553 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l7hzl"/"kube-root-ca.crt" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.734733 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7hzl/must-gather-gpgls"] Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.739251 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc11d67a-25a6-4242-ab75-e6820419a269-must-gather-output\") pod \"must-gather-gpgls\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " 
pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.739657 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm4k8\" (UniqueName: \"kubernetes.io/projected/cc11d67a-25a6-4242-ab75-e6820419a269-kube-api-access-rm4k8\") pod \"must-gather-gpgls\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.841319 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm4k8\" (UniqueName: \"kubernetes.io/projected/cc11d67a-25a6-4242-ab75-e6820419a269-kube-api-access-rm4k8\") pod \"must-gather-gpgls\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.841632 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc11d67a-25a6-4242-ab75-e6820419a269-must-gather-output\") pod \"must-gather-gpgls\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.842043 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc11d67a-25a6-4242-ab75-e6820419a269-must-gather-output\") pod \"must-gather-gpgls\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:50 crc kubenswrapper[4744]: I0930 04:16:50.870554 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm4k8\" (UniqueName: \"kubernetes.io/projected/cc11d67a-25a6-4242-ab75-e6820419a269-kube-api-access-rm4k8\") pod \"must-gather-gpgls\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " 
pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:51 crc kubenswrapper[4744]: I0930 04:16:51.038110 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:16:51 crc kubenswrapper[4744]: I0930 04:16:51.491213 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l7hzl/must-gather-gpgls"] Sep 30 04:16:52 crc kubenswrapper[4744]: I0930 04:16:52.023549 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/must-gather-gpgls" event={"ID":"cc11d67a-25a6-4242-ab75-e6820419a269","Type":"ContainerStarted","Data":"b45de36da5e368e7c61809906504faffd5a7f8e5b63927e3f0cab88bc180898d"} Sep 30 04:16:57 crc kubenswrapper[4744]: I0930 04:16:57.090121 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/must-gather-gpgls" event={"ID":"cc11d67a-25a6-4242-ab75-e6820419a269","Type":"ContainerStarted","Data":"b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18"} Sep 30 04:16:58 crc kubenswrapper[4744]: I0930 04:16:58.103554 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/must-gather-gpgls" event={"ID":"cc11d67a-25a6-4242-ab75-e6820419a269","Type":"ContainerStarted","Data":"cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400"} Sep 30 04:16:58 crc kubenswrapper[4744]: I0930 04:16:58.125018 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7hzl/must-gather-gpgls" podStartSLOduration=3.595391474 podStartE2EDuration="8.125001416s" podCreationTimestamp="2025-09-30 04:16:50 +0000 UTC" firstStartedPulling="2025-09-30 04:16:51.491125533 +0000 UTC m=+4938.664345547" lastFinishedPulling="2025-09-30 04:16:56.020735505 +0000 UTC m=+4943.193955489" observedRunningTime="2025-09-30 04:16:58.11968514 +0000 UTC m=+4945.292905164" watchObservedRunningTime="2025-09-30 04:16:58.125001416 +0000 UTC 
m=+4945.298221390" Sep 30 04:17:02 crc kubenswrapper[4744]: I0930 04:17:02.972157 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-c5gb8"] Sep 30 04:17:02 crc kubenswrapper[4744]: I0930 04:17:02.974004 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.006020 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d76q\" (UniqueName: \"kubernetes.io/projected/637dc2e4-57e0-46f2-b85b-7432bbb2149b-kube-api-access-5d76q\") pod \"crc-debug-c5gb8\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.006069 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/637dc2e4-57e0-46f2-b85b-7432bbb2149b-host\") pod \"crc-debug-c5gb8\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.107390 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d76q\" (UniqueName: \"kubernetes.io/projected/637dc2e4-57e0-46f2-b85b-7432bbb2149b-kube-api-access-5d76q\") pod \"crc-debug-c5gb8\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.107468 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/637dc2e4-57e0-46f2-b85b-7432bbb2149b-host\") pod \"crc-debug-c5gb8\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.107652 4744 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/637dc2e4-57e0-46f2-b85b-7432bbb2149b-host\") pod \"crc-debug-c5gb8\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.136987 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d76q\" (UniqueName: \"kubernetes.io/projected/637dc2e4-57e0-46f2-b85b-7432bbb2149b-kube-api-access-5d76q\") pod \"crc-debug-c5gb8\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:03 crc kubenswrapper[4744]: I0930 04:17:03.289333 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:17:04 crc kubenswrapper[4744]: I0930 04:17:04.166521 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" event={"ID":"637dc2e4-57e0-46f2-b85b-7432bbb2149b","Type":"ContainerStarted","Data":"341d9ebf5260a4ba9e1cd181ea691df290abec55dc168011292a9f5bc05672fd"} Sep 30 04:17:04 crc kubenswrapper[4744]: I0930 04:17:04.347656 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:17:04 crc kubenswrapper[4744]: I0930 04:17:04.347718 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:17:04 crc kubenswrapper[4744]: I0930 04:17:04.347757 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 04:17:04 crc kubenswrapper[4744]: I0930 04:17:04.348303 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 04:17:04 crc kubenswrapper[4744]: I0930 04:17:04.348354 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" gracePeriod=600 Sep 30 04:17:04 crc kubenswrapper[4744]: E0930 04:17:04.475832 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:17:05 crc kubenswrapper[4744]: I0930 04:17:05.177909 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" exitCode=0 Sep 30 04:17:05 crc kubenswrapper[4744]: I0930 04:17:05.177959 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1"} Sep 30 04:17:05 crc kubenswrapper[4744]: I0930 04:17:05.177996 4744 scope.go:117] "RemoveContainer" containerID="84f93c804bb50a2966360c410d3376d06cb89fd915c6123ac6a23b01e8432d05" Sep 30 04:17:05 crc kubenswrapper[4744]: I0930 04:17:05.178713 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:17:05 crc kubenswrapper[4744]: E0930 04:17:05.179027 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:17:14 crc kubenswrapper[4744]: I0930 04:17:14.282510 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" event={"ID":"637dc2e4-57e0-46f2-b85b-7432bbb2149b","Type":"ContainerStarted","Data":"899a3a4e1b091e49790cd500b6ec90a076752cf4c589bd8b4ee2232df6911063"} Sep 30 04:17:14 crc kubenswrapper[4744]: I0930 04:17:14.305535 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" podStartSLOduration=1.899287173 podStartE2EDuration="12.305510664s" podCreationTimestamp="2025-09-30 04:17:02 +0000 UTC" firstStartedPulling="2025-09-30 04:17:03.329650569 +0000 UTC m=+4950.502870543" lastFinishedPulling="2025-09-30 04:17:13.73587405 +0000 UTC m=+4960.909094034" observedRunningTime="2025-09-30 04:17:14.298516656 +0000 UTC m=+4961.471736670" watchObservedRunningTime="2025-09-30 04:17:14.305510664 +0000 UTC m=+4961.478730668" Sep 30 04:17:17 crc kubenswrapper[4744]: I0930 
04:17:17.503402 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:17:17 crc kubenswrapper[4744]: E0930 04:17:17.504141 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:17:30 crc kubenswrapper[4744]: I0930 04:17:30.504143 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:17:30 crc kubenswrapper[4744]: E0930 04:17:30.505839 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:17:43 crc kubenswrapper[4744]: I0930 04:17:43.546157 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:17:43 crc kubenswrapper[4744]: E0930 04:17:43.549734 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:17:56 crc 
kubenswrapper[4744]: I0930 04:17:56.503802 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:17:56 crc kubenswrapper[4744]: E0930 04:17:56.504490 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:18:10 crc kubenswrapper[4744]: I0930 04:18:10.503140 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:18:10 crc kubenswrapper[4744]: E0930 04:18:10.503805 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:18:13 crc kubenswrapper[4744]: I0930 04:18:13.270605 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b554c468b-9gtqj_fbf5b5e2-d32c-4714-864b-06e2f15dd3ce/barbican-api/0.log" Sep 30 04:18:13 crc kubenswrapper[4744]: I0930 04:18:13.284871 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b554c468b-9gtqj_fbf5b5e2-d32c-4714-864b-06e2f15dd3ce/barbican-api-log/0.log" Sep 30 04:18:13 crc kubenswrapper[4744]: I0930 04:18:13.457412 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-578ccf57db-dnd4k_89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e/barbican-keystone-listener/0.log" Sep 30 04:18:13 crc kubenswrapper[4744]: I0930 04:18:13.698419 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58944b8f99-bl9hx_fb5969c0-4230-4813-9009-546eda8657eb/barbican-worker/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.013300 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58944b8f99-bl9hx_fb5969c0-4230-4813-9009-546eda8657eb/barbican-worker-log/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.130808 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-578ccf57db-dnd4k_89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e/barbican-keystone-listener-log/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.273846 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq_2abd1aec-872e-4bcb-a05f-c0d04d689489/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.376407 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/ceilometer-central-agent/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.496410 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/proxy-httpd/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.500219 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/ceilometer-notification-agent/0.log" Sep 30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.585807 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/sg-core/0.log" Sep 
30 04:18:14 crc kubenswrapper[4744]: I0930 04:18:14.929017 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_800e9149-6d7e-4196-bad2-e747131c3e34/ceph/0.log" Sep 30 04:18:15 crc kubenswrapper[4744]: I0930 04:18:15.107721 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5269829d-b1f7-4980-9550-d622fa40c1f1/cinder-api/0.log" Sep 30 04:18:15 crc kubenswrapper[4744]: I0930 04:18:15.116338 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5269829d-b1f7-4980-9550-d622fa40c1f1/cinder-api-log/0.log" Sep 30 04:18:15 crc kubenswrapper[4744]: I0930 04:18:15.262574 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6cc7863-b10e-47a4-bd86-5c66436d4af4/probe/0.log" Sep 30 04:18:15 crc kubenswrapper[4744]: I0930 04:18:15.346326 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1ba32e4a-2e93-4483-9acf-a7a72792b0f6/cinder-scheduler/0.log" Sep 30 04:18:15 crc kubenswrapper[4744]: I0930 04:18:15.599494 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1ba32e4a-2e93-4483-9acf-a7a72792b0f6/probe/0.log" Sep 30 04:18:15 crc kubenswrapper[4744]: I0930 04:18:15.873608 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_597b8dc3-9c8f-48c4-b554-7d8564395142/probe/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.115501 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n_bb8619b8-471f-4b9c-a9ee-97f668713bec/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.298560 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pp49d_3c88be7c-d782-4d4c-9110-997c89d8261e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.501601 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-2fpz5_166b326f-c29c-48e9-b017-034c02b4d448/init/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.511221 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6cc7863-b10e-47a4-bd86-5c66436d4af4/cinder-backup/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.706301 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-2fpz5_166b326f-c29c-48e9-b017-034c02b4d448/init/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.905237 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-2fpz5_166b326f-c29c-48e9-b017-034c02b4d448/dnsmasq-dns/0.log" Sep 30 04:18:16 crc kubenswrapper[4744]: I0930 04:18:16.907202 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2_1ec7b740-1236-48b8-9aa5-0fd0c2f64380/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:17 crc kubenswrapper[4744]: I0930 04:18:17.127885 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6a6b749-14c4-4726-b176-160667e2651d/glance-httpd/0.log" Sep 30 04:18:17 crc kubenswrapper[4744]: I0930 04:18:17.138011 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6a6b749-14c4-4726-b176-160667e2651d/glance-log/0.log" Sep 30 04:18:17 crc kubenswrapper[4744]: I0930 04:18:17.369185 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa/glance-log/0.log" 
Sep 30 04:18:17 crc kubenswrapper[4744]: I0930 04:18:17.433013 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa/glance-httpd/0.log" Sep 30 04:18:17 crc kubenswrapper[4744]: I0930 04:18:17.715361 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78db449746-kg7zl_ff31735f-472e-4b3a-8d81-bc5c392aec09/horizon/0.log" Sep 30 04:18:17 crc kubenswrapper[4744]: I0930 04:18:17.930258 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-4thcz_de456177-d85a-41d5-aa9f-f7d7d6f68e21/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:18 crc kubenswrapper[4744]: I0930 04:18:18.144501 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-frfr6_8a000c1d-f61a-4bb0-8041-acf07914d4de/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:18 crc kubenswrapper[4744]: I0930 04:18:18.277807 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_597b8dc3-9c8f-48c4-b554-7d8564395142/cinder-volume/0.log" Sep 30 04:18:18 crc kubenswrapper[4744]: I0930 04:18:18.307727 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78db449746-kg7zl_ff31735f-472e-4b3a-8d81-bc5c392aec09/horizon-log/0.log" Sep 30 04:18:18 crc kubenswrapper[4744]: I0930 04:18:18.502110 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320081-57dhj_6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec/keystone-cron/0.log" Sep 30 04:18:18 crc kubenswrapper[4744]: I0930 04:18:18.737903 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_825e7c08-c607-429e-bf96-d8c332d03cd1/kube-state-metrics/0.log" Sep 30 04:18:18 crc kubenswrapper[4744]: I0930 04:18:18.858536 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh_fc1867d3-bb6f-4fca-876d-b868bcd284bb/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:19 crc kubenswrapper[4744]: I0930 04:18:19.267529 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a1d320da-1463-4d51-beff-da49872cdb35/manila-api/0.log" Sep 30 04:18:19 crc kubenswrapper[4744]: I0930 04:18:19.456488 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_34aed00c-8bca-400a-bea5-1e7966a35388/probe/0.log" Sep 30 04:18:19 crc kubenswrapper[4744]: I0930 04:18:19.717515 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_34aed00c-8bca-400a-bea5-1e7966a35388/manila-scheduler/0.log" Sep 30 04:18:19 crc kubenswrapper[4744]: I0930 04:18:19.891439 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a1d320da-1463-4d51-beff-da49872cdb35/manila-api-log/0.log" Sep 30 04:18:19 crc kubenswrapper[4744]: I0930 04:18:19.932256 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8f4d1853-2bfc-4470-be87-65c81ff45b97/probe/0.log" Sep 30 04:18:20 crc kubenswrapper[4744]: I0930 04:18:20.185398 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8f4d1853-2bfc-4470-be87-65c81ff45b97/manila-share/0.log" Sep 30 04:18:21 crc kubenswrapper[4744]: I0930 04:18:21.236862 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-59ddc4db88-d9q99_02356cb4-2497-483a-9742-acd6b9080dc2/keystone-api/0.log" Sep 30 04:18:21 crc kubenswrapper[4744]: I0930 04:18:21.504188 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:18:21 crc kubenswrapper[4744]: E0930 04:18:21.505002 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:18:21 crc kubenswrapper[4744]: I0930 04:18:21.608043 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5578f9874f-7lb9c_ecbf3c72-f1cb-48fd-8823-3d3ae2040c86/neutron-httpd/0.log" Sep 30 04:18:21 crc kubenswrapper[4744]: I0930 04:18:21.677132 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb_ced06625-11b0-4e49-9874-9f627107037c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:22 crc kubenswrapper[4744]: I0930 04:18:22.055848 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5578f9874f-7lb9c_ecbf3c72-f1cb-48fd-8823-3d3ae2040c86/neutron-api/0.log" Sep 30 04:18:23 crc kubenswrapper[4744]: I0930 04:18:23.192589 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9d37d81d-59fb-4686-b8b9-34ba95b98cb2/nova-cell0-conductor-conductor/0.log" Sep 30 04:18:23 crc kubenswrapper[4744]: I0930 04:18:23.787945 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_45c58372-9d54-41ad-8059-5666ff3ab3c6/nova-cell1-conductor-conductor/0.log" Sep 30 04:18:24 crc kubenswrapper[4744]: I0930 04:18:24.312645 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_843d7ca4-8741-4c46-9e24-c432261d5c57/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 04:18:24 crc kubenswrapper[4744]: I0930 04:18:24.358859 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5a7ff737-dbb5-4e5c-9862-6b99f8584fc4/nova-api-log/0.log" Sep 30 04:18:24 crc 
kubenswrapper[4744]: I0930 04:18:24.632181 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qj5zd_a0abbf7a-4bd2-4a60-a571-68eae4ea321c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:24 crc kubenswrapper[4744]: I0930 04:18:24.951213 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9aebe30-d132-461b-ad9b-fa6bc9f1227b/nova-metadata-log/0.log" Sep 30 04:18:24 crc kubenswrapper[4744]: I0930 04:18:24.973864 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5a7ff737-dbb5-4e5c-9862-6b99f8584fc4/nova-api-api/0.log" Sep 30 04:18:25 crc kubenswrapper[4744]: I0930 04:18:25.556986 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c1131b4e-532d-478b-bbd8-b52963f60462/mysql-bootstrap/0.log" Sep 30 04:18:25 crc kubenswrapper[4744]: I0930 04:18:25.558140 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_897dfc3d-b2fa-4a22-b5a9-e2ce2c486801/nova-scheduler-scheduler/0.log" Sep 30 04:18:25 crc kubenswrapper[4744]: I0930 04:18:25.751319 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c1131b4e-532d-478b-bbd8-b52963f60462/mysql-bootstrap/0.log" Sep 30 04:18:25 crc kubenswrapper[4744]: I0930 04:18:25.773749 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c1131b4e-532d-478b-bbd8-b52963f60462/galera/0.log" Sep 30 04:18:25 crc kubenswrapper[4744]: I0930 04:18:25.984104 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ddf3db46-b4d2-469a-bc2e-dc5610bb2807/mysql-bootstrap/0.log" Sep 30 04:18:26 crc kubenswrapper[4744]: I0930 04:18:26.157432 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ddf3db46-b4d2-469a-bc2e-dc5610bb2807/mysql-bootstrap/0.log" Sep 30 04:18:26 
crc kubenswrapper[4744]: I0930 04:18:26.224773 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ddf3db46-b4d2-469a-bc2e-dc5610bb2807/galera/0.log" Sep 30 04:18:26 crc kubenswrapper[4744]: I0930 04:18:26.439078 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_952fb37a-2fb5-41f5-a9f6-195c94862274/openstackclient/0.log" Sep 30 04:18:26 crc kubenswrapper[4744]: I0930 04:18:26.618102 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m95jr_6aa7757e-eced-4195-8b1d-88fd7a3b322d/ovn-controller/0.log" Sep 30 04:18:26 crc kubenswrapper[4744]: I0930 04:18:26.828900 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-227s2_dfc48401-bc82-4227-a5f2-22b7b5699433/openstack-network-exporter/0.log" Sep 30 04:18:26 crc kubenswrapper[4744]: I0930 04:18:26.895343 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9aebe30-d132-461b-ad9b-fa6bc9f1227b/nova-metadata-metadata/0.log" Sep 30 04:18:26 crc kubenswrapper[4744]: I0930 04:18:26.998281 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovsdb-server-init/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.192763 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovsdb-server-init/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.248290 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovs-vswitchd/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.255478 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovsdb-server/0.log" Sep 30 04:18:27 crc 
kubenswrapper[4744]: I0930 04:18:27.417459 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-vtk6h_4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.598755 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_09a4d14b-16a0-442c-8444-af404618ae96/openstack-network-exporter/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.618295 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_09a4d14b-16a0-442c-8444-af404618ae96/ovn-northd/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.839421 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7bfc1c21-6422-4308-8370-2dd0b26a3c1e/openstack-network-exporter/0.log" Sep 30 04:18:27 crc kubenswrapper[4744]: I0930 04:18:27.845843 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7bfc1c21-6422-4308-8370-2dd0b26a3c1e/ovsdbserver-nb/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.040534 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e0e55f0-f333-4bc6-9905-18adf601fb9c/openstack-network-exporter/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.052264 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e0e55f0-f333-4bc6-9905-18adf601fb9c/ovsdbserver-sb/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.653642 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ddc58d856-kwfp8_825a9fa1-9368-48f2-9baa-1b8390d0cd3a/placement-api/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.675638 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0eb33cf6-e46d-4f10-b794-6707d21fc4ab/memcached/0.log" Sep 30 04:18:28 crc 
kubenswrapper[4744]: I0930 04:18:28.694160 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_341a2cff-5aae-4952-a8d8-64d5e247d7f9/setup-container/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.724019 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ddc58d856-kwfp8_825a9fa1-9368-48f2-9baa-1b8390d0cd3a/placement-log/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.820954 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_341a2cff-5aae-4952-a8d8-64d5e247d7f9/setup-container/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.890521 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_341a2cff-5aae-4952-a8d8-64d5e247d7f9/rabbitmq/0.log" Sep 30 04:18:28 crc kubenswrapper[4744]: I0930 04:18:28.930405 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7d180fc4-3fb0-4db5-99d7-913559d8ec2e/setup-container/0.log" Sep 30 04:18:29 crc kubenswrapper[4744]: I0930 04:18:29.089570 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7d180fc4-3fb0-4db5-99d7-913559d8ec2e/rabbitmq/0.log" Sep 30 04:18:29 crc kubenswrapper[4744]: I0930 04:18:29.105841 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn_3e4f8446-ac54-4cff-b7f3-025ced28cc74/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:29 crc kubenswrapper[4744]: I0930 04:18:29.106309 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7d180fc4-3fb0-4db5-99d7-913559d8ec2e/setup-container/0.log" Sep 30 04:18:29 crc kubenswrapper[4744]: I0930 04:18:29.766979 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-299d4_50f909f5-fbe0-489d-bb41-59a3318cd416/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:29 crc kubenswrapper[4744]: I0930 04:18:29.774682 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m_1b061989-0be6-4c0d-800f-05bedb5c9a90/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:29 crc kubenswrapper[4744]: I0930 04:18:29.958216 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-q66n7_13c93dcf-8343-45ef-a4cf-3f411d5311e1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.074023 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4lhn_3aa13f38-7b2d-4f65-8ca1-0de736d1f291/ssh-known-hosts-edpm-deployment/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.189972 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64cfcf86c-tq8s6_2a42e069-1859-4077-8f50-8b285465b47a/proxy-server/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.304185 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64cfcf86c-tq8s6_2a42e069-1859-4077-8f50-8b285465b47a/proxy-httpd/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.400931 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qcpwz_35a9b94f-3d1b-40a3-9bcf-279d796e86d9/swift-ring-rebalance/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.494341 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-auditor/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.500277 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-reaper/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.595773 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-replicator/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.648279 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-server/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.661935 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-auditor/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.720429 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-replicator/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.768344 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-server/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.808191 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-updater/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.897312 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-auditor/0.log" Sep 30 04:18:30 crc kubenswrapper[4744]: I0930 04:18:30.914102 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-expirer/0.log" Sep 30 04:18:31 crc kubenswrapper[4744]: I0930 04:18:31.777030 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-server/0.log" Sep 30 04:18:31 crc kubenswrapper[4744]: I0930 04:18:31.799344 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-replicator/0.log" Sep 30 04:18:31 crc kubenswrapper[4744]: I0930 04:18:31.848259 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-updater/0.log" Sep 30 04:18:31 crc kubenswrapper[4744]: I0930 04:18:31.915183 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/rsync/0.log" Sep 30 04:18:31 crc kubenswrapper[4744]: I0930 04:18:31.989441 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/swift-recon-cron/0.log" Sep 30 04:18:32 crc kubenswrapper[4744]: I0930 04:18:32.069656 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-rkxts_cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:32 crc kubenswrapper[4744]: I0930 04:18:32.246360 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6fff3c05-b002-4910-8909-665295c5d940/test-operator-logs-container/0.log" Sep 30 04:18:32 crc kubenswrapper[4744]: I0930 04:18:32.264051 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f4a78f7a-b5bc-4636-81df-578f5105bce3/tempest-tests-tempest-tests-runner/0.log" Sep 30 04:18:32 crc kubenswrapper[4744]: I0930 04:18:32.419171 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb_287b3de6-0593-428e-80d8-b70b360a7d41/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:18:36 crc kubenswrapper[4744]: I0930 04:18:36.504196 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:18:36 crc kubenswrapper[4744]: E0930 04:18:36.504924 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:18:51 crc kubenswrapper[4744]: I0930 04:18:51.504015 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:18:51 crc kubenswrapper[4744]: E0930 04:18:51.504813 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:19:02 crc kubenswrapper[4744]: I0930 04:19:02.503564 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:19:02 crc kubenswrapper[4744]: E0930 04:19:02.504811 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:19:11 crc kubenswrapper[4744]: I0930 04:19:11.404026 4744 generic.go:334] "Generic (PLEG): container finished" podID="637dc2e4-57e0-46f2-b85b-7432bbb2149b" containerID="899a3a4e1b091e49790cd500b6ec90a076752cf4c589bd8b4ee2232df6911063" exitCode=0 Sep 30 04:19:11 crc kubenswrapper[4744]: I0930 04:19:11.404160 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" event={"ID":"637dc2e4-57e0-46f2-b85b-7432bbb2149b","Type":"ContainerDied","Data":"899a3a4e1b091e49790cd500b6ec90a076752cf4c589bd8b4ee2232df6911063"} Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.272397 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.330562 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-c5gb8"] Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.340532 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-c5gb8"] Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.372741 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/637dc2e4-57e0-46f2-b85b-7432bbb2149b-host\") pod \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.372882 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/637dc2e4-57e0-46f2-b85b-7432bbb2149b-host" (OuterVolumeSpecName: "host") pod "637dc2e4-57e0-46f2-b85b-7432bbb2149b" (UID: "637dc2e4-57e0-46f2-b85b-7432bbb2149b"). 
InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.373104 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d76q\" (UniqueName: \"kubernetes.io/projected/637dc2e4-57e0-46f2-b85b-7432bbb2149b-kube-api-access-5d76q\") pod \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\" (UID: \"637dc2e4-57e0-46f2-b85b-7432bbb2149b\") " Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.373715 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/637dc2e4-57e0-46f2-b85b-7432bbb2149b-host\") on node \"crc\" DevicePath \"\"" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.379764 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637dc2e4-57e0-46f2-b85b-7432bbb2149b-kube-api-access-5d76q" (OuterVolumeSpecName: "kube-api-access-5d76q") pod "637dc2e4-57e0-46f2-b85b-7432bbb2149b" (UID: "637dc2e4-57e0-46f2-b85b-7432bbb2149b"). InnerVolumeSpecName "kube-api-access-5d76q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.426568 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="341d9ebf5260a4ba9e1cd181ea691df290abec55dc168011292a9f5bc05672fd" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.426636 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-c5gb8" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.475604 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d76q\" (UniqueName: \"kubernetes.io/projected/637dc2e4-57e0-46f2-b85b-7432bbb2149b-kube-api-access-5d76q\") on node \"crc\" DevicePath \"\"" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.510281 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:19:13 crc kubenswrapper[4744]: E0930 04:19:13.510750 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:19:13 crc kubenswrapper[4744]: I0930 04:19:13.518743 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637dc2e4-57e0-46f2-b85b-7432bbb2149b" path="/var/lib/kubelet/pods/637dc2e4-57e0-46f2-b85b-7432bbb2149b/volumes" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.534832 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-zq2lx"] Sep 30 04:19:14 crc kubenswrapper[4744]: E0930 04:19:14.535809 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637dc2e4-57e0-46f2-b85b-7432bbb2149b" containerName="container-00" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.535833 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="637dc2e4-57e0-46f2-b85b-7432bbb2149b" containerName="container-00" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.536266 4744 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="637dc2e4-57e0-46f2-b85b-7432bbb2149b" containerName="container-00" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.537152 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.652885 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m68jq\" (UniqueName: \"kubernetes.io/projected/df24aa3f-635f-445b-ba07-bd7774073d06-kube-api-access-m68jq\") pod \"crc-debug-zq2lx\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.653195 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df24aa3f-635f-445b-ba07-bd7774073d06-host\") pod \"crc-debug-zq2lx\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.755448 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df24aa3f-635f-445b-ba07-bd7774073d06-host\") pod \"crc-debug-zq2lx\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.755608 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df24aa3f-635f-445b-ba07-bd7774073d06-host\") pod \"crc-debug-zq2lx\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.755694 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m68jq\" (UniqueName: 
\"kubernetes.io/projected/df24aa3f-635f-445b-ba07-bd7774073d06-kube-api-access-m68jq\") pod \"crc-debug-zq2lx\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.775556 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m68jq\" (UniqueName: \"kubernetes.io/projected/df24aa3f-635f-445b-ba07-bd7774073d06-kube-api-access-m68jq\") pod \"crc-debug-zq2lx\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:14 crc kubenswrapper[4744]: I0930 04:19:14.854713 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:15 crc kubenswrapper[4744]: I0930 04:19:15.446972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" event={"ID":"df24aa3f-635f-445b-ba07-bd7774073d06","Type":"ContainerStarted","Data":"223b746d6eaba590555501ad42b5de497c131a638771d65a4f7cad40406524e8"} Sep 30 04:19:15 crc kubenswrapper[4744]: I0930 04:19:15.447308 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" event={"ID":"df24aa3f-635f-445b-ba07-bd7774073d06","Type":"ContainerStarted","Data":"f134524972ccacaef5fac7b3561c95fa6705a4d8e63a7006354a6fbefd5801bd"} Sep 30 04:19:15 crc kubenswrapper[4744]: I0930 04:19:15.477911 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" podStartSLOduration=1.477867759 podStartE2EDuration="1.477867759s" podCreationTimestamp="2025-09-30 04:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 04:19:15.462668945 +0000 UTC m=+5082.635888959" watchObservedRunningTime="2025-09-30 04:19:15.477867759 +0000 UTC 
m=+5082.651087753" Sep 30 04:19:16 crc kubenswrapper[4744]: I0930 04:19:16.457426 4744 generic.go:334] "Generic (PLEG): container finished" podID="df24aa3f-635f-445b-ba07-bd7774073d06" containerID="223b746d6eaba590555501ad42b5de497c131a638771d65a4f7cad40406524e8" exitCode=0 Sep 30 04:19:16 crc kubenswrapper[4744]: I0930 04:19:16.457704 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" event={"ID":"df24aa3f-635f-445b-ba07-bd7774073d06","Type":"ContainerDied","Data":"223b746d6eaba590555501ad42b5de497c131a638771d65a4f7cad40406524e8"} Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.587590 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.706957 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df24aa3f-635f-445b-ba07-bd7774073d06-host\") pod \"df24aa3f-635f-445b-ba07-bd7774073d06\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.707035 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m68jq\" (UniqueName: \"kubernetes.io/projected/df24aa3f-635f-445b-ba07-bd7774073d06-kube-api-access-m68jq\") pod \"df24aa3f-635f-445b-ba07-bd7774073d06\" (UID: \"df24aa3f-635f-445b-ba07-bd7774073d06\") " Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.707061 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df24aa3f-635f-445b-ba07-bd7774073d06-host" (OuterVolumeSpecName: "host") pod "df24aa3f-635f-445b-ba07-bd7774073d06" (UID: "df24aa3f-635f-445b-ba07-bd7774073d06"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.707745 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df24aa3f-635f-445b-ba07-bd7774073d06-host\") on node \"crc\" DevicePath \"\"" Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.723559 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df24aa3f-635f-445b-ba07-bd7774073d06-kube-api-access-m68jq" (OuterVolumeSpecName: "kube-api-access-m68jq") pod "df24aa3f-635f-445b-ba07-bd7774073d06" (UID: "df24aa3f-635f-445b-ba07-bd7774073d06"). InnerVolumeSpecName "kube-api-access-m68jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:19:17 crc kubenswrapper[4744]: I0930 04:19:17.808955 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m68jq\" (UniqueName: \"kubernetes.io/projected/df24aa3f-635f-445b-ba07-bd7774073d06-kube-api-access-m68jq\") on node \"crc\" DevicePath \"\"" Sep 30 04:19:18 crc kubenswrapper[4744]: I0930 04:19:18.473987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" event={"ID":"df24aa3f-635f-445b-ba07-bd7774073d06","Type":"ContainerDied","Data":"f134524972ccacaef5fac7b3561c95fa6705a4d8e63a7006354a6fbefd5801bd"} Sep 30 04:19:18 crc kubenswrapper[4744]: I0930 04:19:18.474031 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-zq2lx" Sep 30 04:19:18 crc kubenswrapper[4744]: I0930 04:19:18.474044 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f134524972ccacaef5fac7b3561c95fa6705a4d8e63a7006354a6fbefd5801bd" Sep 30 04:19:24 crc kubenswrapper[4744]: I0930 04:19:24.505237 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:19:24 crc kubenswrapper[4744]: E0930 04:19:24.505966 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:19:24 crc kubenswrapper[4744]: I0930 04:19:24.633807 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-zq2lx"] Sep 30 04:19:24 crc kubenswrapper[4744]: I0930 04:19:24.640919 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-zq2lx"] Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.544945 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df24aa3f-635f-445b-ba07-bd7774073d06" path="/var/lib/kubelet/pods/df24aa3f-635f-445b-ba07-bd7774073d06/volumes" Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.875888 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-7dnk9"] Sep 30 04:19:25 crc kubenswrapper[4744]: E0930 04:19:25.876818 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df24aa3f-635f-445b-ba07-bd7774073d06" containerName="container-00" Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.876979 4744 
state_mem.go:107] "Deleted CPUSet assignment" podUID="df24aa3f-635f-445b-ba07-bd7774073d06" containerName="container-00" Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.877470 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="df24aa3f-635f-445b-ba07-bd7774073d06" containerName="container-00" Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.878716 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.969581 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44d71a3-fca9-465f-8138-8463dda58d72-host\") pod \"crc-debug-7dnk9\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:25 crc kubenswrapper[4744]: I0930 04:19:25.969768 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25mdj\" (UniqueName: \"kubernetes.io/projected/f44d71a3-fca9-465f-8138-8463dda58d72-kube-api-access-25mdj\") pod \"crc-debug-7dnk9\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.072144 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25mdj\" (UniqueName: \"kubernetes.io/projected/f44d71a3-fca9-465f-8138-8463dda58d72-kube-api-access-25mdj\") pod \"crc-debug-7dnk9\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.072649 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44d71a3-fca9-465f-8138-8463dda58d72-host\") pod \"crc-debug-7dnk9\" (UID: 
\"f44d71a3-fca9-465f-8138-8463dda58d72\") " pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.072899 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44d71a3-fca9-465f-8138-8463dda58d72-host\") pod \"crc-debug-7dnk9\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.113270 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25mdj\" (UniqueName: \"kubernetes.io/projected/f44d71a3-fca9-465f-8138-8463dda58d72-kube-api-access-25mdj\") pod \"crc-debug-7dnk9\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.207407 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:26 crc kubenswrapper[4744]: W0930 04:19:26.266895 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf44d71a3_fca9_465f_8138_8463dda58d72.slice/crio-1da18bfbb4cd64d718468450bd922fd838d209d82fe66b5f24a4f9d67c9e91fb WatchSource:0}: Error finding container 1da18bfbb4cd64d718468450bd922fd838d209d82fe66b5f24a4f9d67c9e91fb: Status 404 returned error can't find the container with id 1da18bfbb4cd64d718468450bd922fd838d209d82fe66b5f24a4f9d67c9e91fb Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.589361 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" event={"ID":"f44d71a3-fca9-465f-8138-8463dda58d72","Type":"ContainerStarted","Data":"b8ecabd6c5fdd546f26a5df55ec9e9041867248e43efcee8e05539bc01eeed33"} Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.589739 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" event={"ID":"f44d71a3-fca9-465f-8138-8463dda58d72","Type":"ContainerStarted","Data":"1da18bfbb4cd64d718468450bd922fd838d209d82fe66b5f24a4f9d67c9e91fb"} Sep 30 04:19:26 crc kubenswrapper[4744]: I0930 04:19:26.608629 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" podStartSLOduration=1.6086102740000001 podStartE2EDuration="1.608610274s" podCreationTimestamp="2025-09-30 04:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 04:19:26.602627697 +0000 UTC m=+5093.775847671" watchObservedRunningTime="2025-09-30 04:19:26.608610274 +0000 UTC m=+5093.781830248" Sep 30 04:19:27 crc kubenswrapper[4744]: I0930 04:19:27.611328 4744 generic.go:334] "Generic (PLEG): container finished" podID="f44d71a3-fca9-465f-8138-8463dda58d72" containerID="b8ecabd6c5fdd546f26a5df55ec9e9041867248e43efcee8e05539bc01eeed33" exitCode=0 Sep 30 04:19:27 crc kubenswrapper[4744]: I0930 04:19:27.611416 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" event={"ID":"f44d71a3-fca9-465f-8138-8463dda58d72","Type":"ContainerDied","Data":"b8ecabd6c5fdd546f26a5df55ec9e9041867248e43efcee8e05539bc01eeed33"} Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.770227 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.814397 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-7dnk9"] Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.823184 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l7hzl/crc-debug-7dnk9"] Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.944451 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25mdj\" (UniqueName: \"kubernetes.io/projected/f44d71a3-fca9-465f-8138-8463dda58d72-kube-api-access-25mdj\") pod \"f44d71a3-fca9-465f-8138-8463dda58d72\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.944679 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44d71a3-fca9-465f-8138-8463dda58d72-host\") pod \"f44d71a3-fca9-465f-8138-8463dda58d72\" (UID: \"f44d71a3-fca9-465f-8138-8463dda58d72\") " Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.945139 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f44d71a3-fca9-465f-8138-8463dda58d72-host" (OuterVolumeSpecName: "host") pod "f44d71a3-fca9-465f-8138-8463dda58d72" (UID: "f44d71a3-fca9-465f-8138-8463dda58d72"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 04:19:28 crc kubenswrapper[4744]: I0930 04:19:28.965690 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44d71a3-fca9-465f-8138-8463dda58d72-kube-api-access-25mdj" (OuterVolumeSpecName: "kube-api-access-25mdj") pod "f44d71a3-fca9-465f-8138-8463dda58d72" (UID: "f44d71a3-fca9-465f-8138-8463dda58d72"). InnerVolumeSpecName "kube-api-access-25mdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:19:29 crc kubenswrapper[4744]: I0930 04:19:29.046924 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25mdj\" (UniqueName: \"kubernetes.io/projected/f44d71a3-fca9-465f-8138-8463dda58d72-kube-api-access-25mdj\") on node \"crc\" DevicePath \"\"" Sep 30 04:19:29 crc kubenswrapper[4744]: I0930 04:19:29.046975 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f44d71a3-fca9-465f-8138-8463dda58d72-host\") on node \"crc\" DevicePath \"\"" Sep 30 04:19:29 crc kubenswrapper[4744]: I0930 04:19:29.519530 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44d71a3-fca9-465f-8138-8463dda58d72" path="/var/lib/kubelet/pods/f44d71a3-fca9-465f-8138-8463dda58d72/volumes" Sep 30 04:19:29 crc kubenswrapper[4744]: I0930 04:19:29.637590 4744 scope.go:117] "RemoveContainer" containerID="b8ecabd6c5fdd546f26a5df55ec9e9041867248e43efcee8e05539bc01eeed33" Sep 30 04:19:29 crc kubenswrapper[4744]: I0930 04:19:29.637663 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7hzl/crc-debug-7dnk9" Sep 30 04:19:30 crc kubenswrapper[4744]: I0930 04:19:30.615508 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nh8lm_464a78d3-19ea-4024-95f8-65c384a11de5/kube-rbac-proxy/0.log" Sep 30 04:19:30 crc kubenswrapper[4744]: I0930 04:19:30.695188 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nh8lm_464a78d3-19ea-4024-95f8-65c384a11de5/manager/0.log" Sep 30 04:19:30 crc kubenswrapper[4744]: I0930 04:19:30.815496 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lfgcl_48c24f9c-7ad2-4b16-8586-a98cc6f5745d/kube-rbac-proxy/0.log" Sep 30 04:19:30 crc kubenswrapper[4744]: I0930 04:19:30.894516 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lfgcl_48c24f9c-7ad2-4b16-8586-a98cc6f5745d/manager/0.log" Sep 30 04:19:30 crc kubenswrapper[4744]: I0930 04:19:30.948605 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-r4pn4_2754694b-4135-4439-ae89-dd08166467a5/kube-rbac-proxy/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.032204 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-r4pn4_2754694b-4135-4439-ae89-dd08166467a5/manager/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.116804 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/util/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.299851 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/util/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.317803 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/pull/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.344240 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/pull/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.556310 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/util/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.587519 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/pull/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.601799 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/extract/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.781946 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-f5wr4_7c60f5e8-9ac8-4729-9030-a17a74c66872/kube-rbac-proxy/0.log" Sep 30 04:19:31 crc kubenswrapper[4744]: I0930 04:19:31.827278 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-f5wr4_7c60f5e8-9ac8-4729-9030-a17a74c66872/manager/0.log" Sep 30 04:19:31 crc 
kubenswrapper[4744]: I0930 04:19:31.833246 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-jrs8k_c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad/kube-rbac-proxy/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.003578 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-jrmql_a6acae25-c5f3-4719-9a0d-866cef31aae8/kube-rbac-proxy/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.031012 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-jrs8k_c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad/manager/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.066687 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-jrmql_a6acae25-c5f3-4719-9a0d-866cef31aae8/manager/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.671145 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-z8f6l_a21e2f23-2adc-4f24-be18-72c39bb6ac8e/kube-rbac-proxy/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.854057 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-plwv5_1416686e-3057-4219-93e8-b6ed99e1b000/kube-rbac-proxy/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.896984 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-z8f6l_a21e2f23-2adc-4f24-be18-72c39bb6ac8e/manager/0.log" Sep 30 04:19:32 crc kubenswrapper[4744]: I0930 04:19:32.960230 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-plwv5_1416686e-3057-4219-93e8-b6ed99e1b000/manager/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.086104 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-n4bm8_befb38ef-208d-435f-820a-787301b3c4b8/kube-rbac-proxy/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.131966 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-n4bm8_befb38ef-208d-435f-820a-787301b3c4b8/manager/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.238625 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-xc7mx_edff3052-2bfd-47d9-be42-5d8f608fc529/kube-rbac-proxy/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.369310 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-xc7mx_edff3052-2bfd-47d9-be42-5d8f608fc529/manager/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.399099 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5whrj_7955a9b4-f81b-45cd-bc57-b96bef24b064/kube-rbac-proxy/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.460022 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5whrj_7955a9b4-f81b-45cd-bc57-b96bef24b064/manager/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.573953 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-hl4qt_adaa00a7-7a31-40ae-975e-47306e8128e8/kube-rbac-proxy/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.628866 4744 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-hl4qt_adaa00a7-7a31-40ae-975e-47306e8128e8/manager/0.log" Sep 30 04:19:33 crc kubenswrapper[4744]: I0930 04:19:33.650981 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hjmhz_be907fa2-e5ce-461e-bad7-7ff67b7b28fc/kube-rbac-proxy/0.log" Sep 30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.378189 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-zvxch_bcc255bc-d09d-4f16-b541-4e206fb39a80/kube-rbac-proxy/0.log" Sep 30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.387742 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-zvxch_bcc255bc-d09d-4f16-b541-4e206fb39a80/manager/0.log" Sep 30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.503913 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hjmhz_be907fa2-e5ce-461e-bad7-7ff67b7b28fc/manager/0.log" Sep 30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.562080 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-gmkqb_2fdc94bc-95cf-4a16-a6cc-0d277f4969bc/kube-rbac-proxy/0.log" Sep 30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.605666 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-gmkqb_2fdc94bc-95cf-4a16-a6cc-0d277f4969bc/manager/0.log" Sep 30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.670684 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ff84bd547-bw5gx_1dc02df0-0d0c-49bc-b3e8-69efc93c3167/kube-rbac-proxy/0.log" Sep 
30 04:19:34 crc kubenswrapper[4744]: I0930 04:19:34.794894 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d55cf86f4-4xvw5_fec734bd-0bd4-4e73-9d2d-cd6f0f002577/kube-rbac-proxy/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.010076 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d55cf86f4-4xvw5_fec734bd-0bd4-4e73-9d2d-cd6f0f002577/operator/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.036036 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2drwl_f429d57e-28b9-4f82-bb1f-494d295492d1/registry-server/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.243598 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-kd8v7_5c70f54b-6405-4dcc-a2d2-e989b2516f0e/kube-rbac-proxy/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.276425 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-kd8v7_5c70f54b-6405-4dcc-a2d2-e989b2516f0e/manager/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.351263 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-pcn79_015eb732-be5e-404f-81e2-b43d012c356b/kube-rbac-proxy/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.466224 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-pcn79_015eb732-be5e-404f-81e2-b43d012c356b/manager/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.495788 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-2vpmm_bb137b23-9366-4d2c-bc9d-ec50ccaef237/operator/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.661708 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-g6w7n_c46f8a8d-f07f-4983-9971-6b06d47c8e38/kube-rbac-proxy/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.687434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-g6w7n_c46f8a8d-f07f-4983-9971-6b06d47c8e38/manager/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.736553 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5fx9s_5ae4d03d-68a5-498a-992f-df43dbeebc73/kube-rbac-proxy/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.801919 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ff84bd547-bw5gx_1dc02df0-0d0c-49bc-b3e8-69efc93c3167/manager/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.883735 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5fx9s_5ae4d03d-68a5-498a-992f-df43dbeebc73/manager/0.log" Sep 30 04:19:35 crc kubenswrapper[4744]: I0930 04:19:35.929495 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-p5zk9_5502e6e0-3d2f-479c-a53f-005bbb749631/kube-rbac-proxy/0.log" Sep 30 04:19:36 crc kubenswrapper[4744]: I0930 04:19:36.002979 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-p5zk9_5502e6e0-3d2f-479c-a53f-005bbb749631/manager/0.log" Sep 30 04:19:36 crc kubenswrapper[4744]: I0930 04:19:36.072819 4744 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-pvzfj_6293cdef-44a9-4639-a40d-df02e9aa8410/kube-rbac-proxy/0.log" Sep 30 04:19:36 crc kubenswrapper[4744]: I0930 04:19:36.073170 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-pvzfj_6293cdef-44a9-4639-a40d-df02e9aa8410/manager/0.log" Sep 30 04:19:36 crc kubenswrapper[4744]: I0930 04:19:36.503554 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:19:36 crc kubenswrapper[4744]: E0930 04:19:36.503805 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:19:48 crc kubenswrapper[4744]: I0930 04:19:48.503884 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:19:48 crc kubenswrapper[4744]: E0930 04:19:48.506570 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:19:54 crc kubenswrapper[4744]: I0930 04:19:54.271179 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ssdfc_6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e/control-plane-machine-set-operator/0.log" Sep 30 04:19:54 crc kubenswrapper[4744]: I0930 04:19:54.416309 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxxmx_e771fd9b-4d78-4117-ac7c-40595fa5eb0b/kube-rbac-proxy/0.log" Sep 30 04:19:54 crc kubenswrapper[4744]: I0930 04:19:54.436578 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxxmx_e771fd9b-4d78-4117-ac7c-40595fa5eb0b/machine-api-operator/0.log" Sep 30 04:20:03 crc kubenswrapper[4744]: I0930 04:20:03.515629 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:20:03 crc kubenswrapper[4744]: E0930 04:20:03.516300 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:20:08 crc kubenswrapper[4744]: I0930 04:20:08.893487 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-tcddl_c692f12b-868a-4985-8c61-529463a4bbf5/cert-manager-controller/0.log" Sep 30 04:20:09 crc kubenswrapper[4744]: I0930 04:20:09.079286 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-fc4dx_05f345c2-2e42-4cf0-85f6-6a40551d51d7/cert-manager-webhook/0.log" Sep 30 04:20:09 crc kubenswrapper[4744]: I0930 04:20:09.093593 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-vhcl2_fa738a2a-d979-4352-82d3-ed7eb89e8fd9/cert-manager-cainjector/0.log" Sep 30 04:20:18 crc kubenswrapper[4744]: I0930 04:20:18.503708 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:20:18 crc kubenswrapper[4744]: E0930 04:20:18.504604 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:20:21 crc kubenswrapper[4744]: I0930 04:20:21.872434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-kg924_c58a40af-7fd8-4a82-8109-855fbb1c32f3/nmstate-console-plugin/0.log" Sep 30 04:20:22 crc kubenswrapper[4744]: I0930 04:20:22.060888 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-57mt4_1b5d08ff-f7f3-4f08-a8be-dd45390037e4/nmstate-handler/0.log" Sep 30 04:20:22 crc kubenswrapper[4744]: I0930 04:20:22.092505 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tt4lr_ca163139-502b-44cc-ae53-83bc49866259/kube-rbac-proxy/0.log" Sep 30 04:20:22 crc kubenswrapper[4744]: I0930 04:20:22.148878 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tt4lr_ca163139-502b-44cc-ae53-83bc49866259/nmstate-metrics/0.log" Sep 30 04:20:22 crc kubenswrapper[4744]: I0930 04:20:22.300241 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-8lx9f_1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c/nmstate-operator/0.log" Sep 30 04:20:22 crc kubenswrapper[4744]: I0930 04:20:22.358578 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-lclrw_37a9886e-c2c0-46ab-a260-57231999e956/nmstate-webhook/0.log" Sep 30 04:20:29 crc kubenswrapper[4744]: I0930 04:20:29.504215 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:20:29 crc kubenswrapper[4744]: E0930 04:20:29.505350 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.108705 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-jg7r7_15204744-c1cf-4027-8131-fd89b0544638/kube-rbac-proxy/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.121643 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-jg7r7_15204744-c1cf-4027-8131-fd89b0544638/controller/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.262189 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.481125 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.495105 4744 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.498035 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.540981 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.672393 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.683911 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.689806 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.767556 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.907943 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.914895 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:20:38 crc kubenswrapper[4744]: I0930 04:20:38.917934 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:20:39 crc kubenswrapper[4744]: I0930 04:20:39.692096 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/controller/0.log" Sep 30 04:20:39 crc kubenswrapper[4744]: I0930 04:20:39.748260 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/frr-metrics/0.log" Sep 30 04:20:39 crc kubenswrapper[4744]: I0930 04:20:39.762864 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/kube-rbac-proxy/0.log" Sep 30 04:20:39 crc kubenswrapper[4744]: I0930 04:20:39.919517 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/kube-rbac-proxy-frr/0.log" Sep 30 04:20:39 crc kubenswrapper[4744]: I0930 04:20:39.943483 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/reloader/0.log" Sep 30 04:20:40 crc kubenswrapper[4744]: I0930 04:20:40.204879 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bwg9t_416e2fa8-29ae-42c2-a71a-863244e1b5df/frr-k8s-webhook-server/0.log" Sep 30 04:20:40 crc kubenswrapper[4744]: I0930 04:20:40.252239 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58d4cc4478-pt5sc_3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e/manager/0.log" Sep 30 04:20:40 crc kubenswrapper[4744]: I0930 04:20:40.592801 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-67f6c5dc78-c7h6t_cc149279-3823-4588-a559-a348efdb9bcd/webhook-server/0.log" Sep 30 04:20:40 crc kubenswrapper[4744]: I0930 04:20:40.716004 4744 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dthlc_5ae230cf-d8e3-49d5-a336-fd028e0f5303/kube-rbac-proxy/0.log" Sep 30 04:20:40 crc kubenswrapper[4744]: I0930 04:20:40.906367 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/frr/0.log" Sep 30 04:20:41 crc kubenswrapper[4744]: I0930 04:20:41.190123 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dthlc_5ae230cf-d8e3-49d5-a336-fd028e0f5303/speaker/0.log" Sep 30 04:20:41 crc kubenswrapper[4744]: I0930 04:20:41.504030 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:20:41 crc kubenswrapper[4744]: E0930 04:20:41.504460 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.326718 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/util/0.log" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.757105 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/pull/0.log" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.786584 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/util/0.log" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.790395 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/pull/0.log" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.926847 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/util/0.log" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.937168 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/pull/0.log" Sep 30 04:20:55 crc kubenswrapper[4744]: I0930 04:20:55.993745 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/extract/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.111459 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-utilities/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.305331 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-content/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.320620 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-content/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.378222 4744 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-utilities/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.503427 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:20:56 crc kubenswrapper[4744]: E0930 04:20:56.503710 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.523912 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-utilities/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.565171 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-content/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.763969 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-utilities/0.log" Sep 30 04:20:56 crc kubenswrapper[4744]: I0930 04:20:56.966286 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-content/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.055132 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-utilities/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.081526 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-content/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.180053 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/registry-server/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.243730 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-utilities/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.260853 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-content/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.513684 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/util/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.679650 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/util/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.720102 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/pull/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.747328 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/pull/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.924846 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/registry-server/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.943927 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/extract/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.956073 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/util/0.log" Sep 30 04:20:57 crc kubenswrapper[4744]: I0930 04:20:57.957222 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/pull/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.165970 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wkhkg_fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280/marketplace-operator/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.167941 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-utilities/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.343074 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-utilities/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.345230 4744 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-content/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.351216 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-content/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.515098 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-utilities/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.552061 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-content/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.561880 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-utilities/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.724351 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/registry-server/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.812463 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-utilities/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.819464 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-content/0.log" Sep 30 04:20:58 crc kubenswrapper[4744]: I0930 04:20:58.849967 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-content/0.log" Sep 30 04:20:59 crc kubenswrapper[4744]: I0930 04:20:59.022684 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-content/0.log" Sep 30 04:20:59 crc kubenswrapper[4744]: I0930 04:20:59.023282 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-utilities/0.log" Sep 30 04:20:59 crc kubenswrapper[4744]: I0930 04:20:59.608441 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/registry-server/0.log" Sep 30 04:21:08 crc kubenswrapper[4744]: I0930 04:21:08.503885 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:21:08 crc kubenswrapper[4744]: E0930 04:21:08.504905 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:21:21 crc kubenswrapper[4744]: I0930 04:21:21.503670 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:21:21 crc kubenswrapper[4744]: E0930 04:21:21.504525 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:21:32 crc kubenswrapper[4744]: I0930 04:21:32.503489 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:21:32 crc kubenswrapper[4744]: E0930 04:21:32.504109 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:21:46 crc kubenswrapper[4744]: I0930 04:21:46.503707 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:21:46 crc kubenswrapper[4744]: E0930 04:21:46.505252 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:21:58 crc kubenswrapper[4744]: I0930 04:21:58.504267 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:21:58 crc kubenswrapper[4744]: E0930 04:21:58.505393 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:22:10 crc kubenswrapper[4744]: I0930 04:22:10.504723 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:22:11 crc kubenswrapper[4744]: I0930 04:22:11.462783 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"daf39e736917ca0a1b7fe85b78046b281e6dcd02d5f095b3a5da30b062ada306"} Sep 30 04:22:45 crc kubenswrapper[4744]: I0930 04:22:45.905356 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzwlc"] Sep 30 04:22:45 crc kubenswrapper[4744]: E0930 04:22:45.906295 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44d71a3-fca9-465f-8138-8463dda58d72" containerName="container-00" Sep 30 04:22:45 crc kubenswrapper[4744]: I0930 04:22:45.906310 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44d71a3-fca9-465f-8138-8463dda58d72" containerName="container-00" Sep 30 04:22:45 crc kubenswrapper[4744]: I0930 04:22:45.906602 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44d71a3-fca9-465f-8138-8463dda58d72" containerName="container-00" Sep 30 04:22:45 crc kubenswrapper[4744]: I0930 04:22:45.907999 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:45 crc kubenswrapper[4744]: I0930 04:22:45.928560 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzwlc"] Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.004944 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-catalog-content\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.005046 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-utilities\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.005119 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74zjn\" (UniqueName: \"kubernetes.io/projected/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-kube-api-access-74zjn\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.107332 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-catalog-content\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.107413 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-utilities\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.107440 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74zjn\" (UniqueName: \"kubernetes.io/projected/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-kube-api-access-74zjn\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.107908 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-utilities\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.108161 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-catalog-content\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.132325 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74zjn\" (UniqueName: \"kubernetes.io/projected/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-kube-api-access-74zjn\") pod \"certified-operators-qzwlc\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.244560 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:46 crc kubenswrapper[4744]: I0930 04:22:46.739265 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzwlc"] Sep 30 04:22:47 crc kubenswrapper[4744]: W0930 04:22:47.563004 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb23aaa1_7e83_4d89_aaac_a894a9ffd006.slice/crio-72744667cb40a0c06302ef497e93276aeae0423108b0a5c1a481de2850f1d03a WatchSource:0}: Error finding container 72744667cb40a0c06302ef497e93276aeae0423108b0a5c1a481de2850f1d03a: Status 404 returned error can't find the container with id 72744667cb40a0c06302ef497e93276aeae0423108b0a5c1a481de2850f1d03a Sep 30 04:22:47 crc kubenswrapper[4744]: I0930 04:22:47.853311 4744 generic.go:334] "Generic (PLEG): container finished" podID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerID="383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a" exitCode=0 Sep 30 04:22:47 crc kubenswrapper[4744]: I0930 04:22:47.853407 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerDied","Data":"383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a"} Sep 30 04:22:47 crc kubenswrapper[4744]: I0930 04:22:47.853747 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerStarted","Data":"72744667cb40a0c06302ef497e93276aeae0423108b0a5c1a481de2850f1d03a"} Sep 30 04:22:47 crc kubenswrapper[4744]: I0930 04:22:47.856320 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 04:22:48 crc kubenswrapper[4744]: I0930 04:22:48.868823 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerStarted","Data":"fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd"} Sep 30 04:22:49 crc kubenswrapper[4744]: I0930 04:22:49.885776 4744 generic.go:334] "Generic (PLEG): container finished" podID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerID="fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd" exitCode=0 Sep 30 04:22:49 crc kubenswrapper[4744]: I0930 04:22:49.885882 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerDied","Data":"fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd"} Sep 30 04:22:50 crc kubenswrapper[4744]: I0930 04:22:50.902503 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerStarted","Data":"827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa"} Sep 30 04:22:50 crc kubenswrapper[4744]: I0930 04:22:50.942220 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzwlc" podStartSLOduration=3.474247783 podStartE2EDuration="5.942190834s" podCreationTimestamp="2025-09-30 04:22:45 +0000 UTC" firstStartedPulling="2025-09-30 04:22:47.856047015 +0000 UTC m=+5295.029266989" lastFinishedPulling="2025-09-30 04:22:50.323990026 +0000 UTC m=+5297.497210040" observedRunningTime="2025-09-30 04:22:50.925288097 +0000 UTC m=+5298.098508111" watchObservedRunningTime="2025-09-30 04:22:50.942190834 +0000 UTC m=+5298.115410848" Sep 30 04:22:56 crc kubenswrapper[4744]: I0930 04:22:56.245259 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:56 crc kubenswrapper[4744]: I0930 04:22:56.246266 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:56 crc kubenswrapper[4744]: I0930 04:22:56.336728 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:57 crc kubenswrapper[4744]: I0930 04:22:57.058238 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:57 crc kubenswrapper[4744]: I0930 04:22:57.122517 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzwlc"] Sep 30 04:22:58 crc kubenswrapper[4744]: I0930 04:22:58.999048 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzwlc" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="registry-server" containerID="cri-o://827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa" gracePeriod=2 Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.475708 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.655482 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-catalog-content\") pod \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.655732 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74zjn\" (UniqueName: \"kubernetes.io/projected/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-kube-api-access-74zjn\") pod \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.655797 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-utilities\") pod \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\" (UID: \"fb23aaa1-7e83-4d89-aaac-a894a9ffd006\") " Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.657504 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-utilities" (OuterVolumeSpecName: "utilities") pod "fb23aaa1-7e83-4d89-aaac-a894a9ffd006" (UID: "fb23aaa1-7e83-4d89-aaac-a894a9ffd006"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.669660 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-kube-api-access-74zjn" (OuterVolumeSpecName: "kube-api-access-74zjn") pod "fb23aaa1-7e83-4d89-aaac-a894a9ffd006" (UID: "fb23aaa1-7e83-4d89-aaac-a894a9ffd006"). InnerVolumeSpecName "kube-api-access-74zjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.719415 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb23aaa1-7e83-4d89-aaac-a894a9ffd006" (UID: "fb23aaa1-7e83-4d89-aaac-a894a9ffd006"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.758896 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.758933 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74zjn\" (UniqueName: \"kubernetes.io/projected/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-kube-api-access-74zjn\") on node \"crc\" DevicePath \"\"" Sep 30 04:22:59 crc kubenswrapper[4744]: I0930 04:22:59.758946 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb23aaa1-7e83-4d89-aaac-a894a9ffd006-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.014829 4744 generic.go:334] "Generic (PLEG): container finished" podID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerID="827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa" exitCode=0 Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.014874 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerDied","Data":"827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa"} Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.014953 4744 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-qzwlc" event={"ID":"fb23aaa1-7e83-4d89-aaac-a894a9ffd006","Type":"ContainerDied","Data":"72744667cb40a0c06302ef497e93276aeae0423108b0a5c1a481de2850f1d03a"} Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.014979 4744 scope.go:117] "RemoveContainer" containerID="827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa" Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.014952 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzwlc" Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.068571 4744 scope.go:117] "RemoveContainer" containerID="fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd" Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.096704 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzwlc"] Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.111537 4744 scope.go:117] "RemoveContainer" containerID="383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a" Sep 30 04:23:00 crc kubenswrapper[4744]: I0930 04:23:00.112050 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qzwlc"] Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 04:23:01.173701 4744 scope.go:117] "RemoveContainer" containerID="827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa" Sep 30 04:23:01 crc kubenswrapper[4744]: E0930 04:23:01.174494 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa\": container with ID starting with 827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa not found: ID does not exist" containerID="827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa" Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 
04:23:01.174538 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa"} err="failed to get container status \"827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa\": rpc error: code = NotFound desc = could not find container \"827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa\": container with ID starting with 827a52275d2523601a138eef40490fbf34ee97677c2f0e3e4c7f4da7b1a4e3fa not found: ID does not exist" Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 04:23:01.174574 4744 scope.go:117] "RemoveContainer" containerID="fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd" Sep 30 04:23:01 crc kubenswrapper[4744]: E0930 04:23:01.175015 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd\": container with ID starting with fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd not found: ID does not exist" containerID="fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd" Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 04:23:01.175056 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd"} err="failed to get container status \"fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd\": rpc error: code = NotFound desc = could not find container \"fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd\": container with ID starting with fe1478149bef6836c9a111a6e8c2f0929cfeca68ddf20e035519768142c190bd not found: ID does not exist" Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 04:23:01.175081 4744 scope.go:117] "RemoveContainer" containerID="383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a" Sep 30 04:23:01 crc 
kubenswrapper[4744]: E0930 04:23:01.175793 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a\": container with ID starting with 383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a not found: ID does not exist" containerID="383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a" Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 04:23:01.175838 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a"} err="failed to get container status \"383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a\": rpc error: code = NotFound desc = could not find container \"383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a\": container with ID starting with 383997fd7f659a6545b077f873276539d021217e3bc7e8c21e38845b9bcf8c4a not found: ID does not exist" Sep 30 04:23:01 crc kubenswrapper[4744]: I0930 04:23:01.521526 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" path="/var/lib/kubelet/pods/fb23aaa1-7e83-4d89-aaac-a894a9ffd006/volumes" Sep 30 04:23:19 crc kubenswrapper[4744]: I0930 04:23:19.268278 4744 generic.go:334] "Generic (PLEG): container finished" podID="cc11d67a-25a6-4242-ab75-e6820419a269" containerID="b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18" exitCode=0 Sep 30 04:23:19 crc kubenswrapper[4744]: I0930 04:23:19.268910 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l7hzl/must-gather-gpgls" event={"ID":"cc11d67a-25a6-4242-ab75-e6820419a269","Type":"ContainerDied","Data":"b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18"} Sep 30 04:23:19 crc kubenswrapper[4744]: I0930 04:23:19.269779 4744 scope.go:117] "RemoveContainer" 
containerID="b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18" Sep 30 04:23:19 crc kubenswrapper[4744]: I0930 04:23:19.360581 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l7hzl_must-gather-gpgls_cc11d67a-25a6-4242-ab75-e6820419a269/gather/0.log" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.912253 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hjrrk"] Sep 30 04:23:25 crc kubenswrapper[4744]: E0930 04:23:25.913777 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="extract-content" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.913816 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="extract-content" Sep 30 04:23:25 crc kubenswrapper[4744]: E0930 04:23:25.913839 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="extract-utilities" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.913857 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="extract-utilities" Sep 30 04:23:25 crc kubenswrapper[4744]: E0930 04:23:25.913918 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="registry-server" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.913937 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="registry-server" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.914501 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb23aaa1-7e83-4d89-aaac-a894a9ffd006" containerName="registry-server" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.922192 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.945365 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjrrk"] Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.946115 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncqr\" (UniqueName: \"kubernetes.io/projected/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-kube-api-access-lncqr\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.946212 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-catalog-content\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:25 crc kubenswrapper[4744]: I0930 04:23:25.946309 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-utilities\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.047996 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-catalog-content\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.048139 4744 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-utilities\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.048259 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lncqr\" (UniqueName: \"kubernetes.io/projected/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-kube-api-access-lncqr\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.048704 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-catalog-content\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.049094 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-utilities\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.070474 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncqr\" (UniqueName: \"kubernetes.io/projected/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-kube-api-access-lncqr\") pod \"community-operators-hjrrk\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:26 crc kubenswrapper[4744]: I0930 04:23:26.247850 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:27 crc kubenswrapper[4744]: I0930 04:23:27.102439 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjrrk"] Sep 30 04:23:27 crc kubenswrapper[4744]: I0930 04:23:27.379510 4744 generic.go:334] "Generic (PLEG): container finished" podID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerID="d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607" exitCode=0 Sep 30 04:23:27 crc kubenswrapper[4744]: I0930 04:23:27.379576 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerDied","Data":"d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607"} Sep 30 04:23:27 crc kubenswrapper[4744]: I0930 04:23:27.379854 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerStarted","Data":"324873d4f10a9dc0c15575166c1975ea5ad5f33ef3a036adc55d3d4bbf38a491"} Sep 30 04:23:28 crc kubenswrapper[4744]: I0930 04:23:28.389898 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerStarted","Data":"814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652"} Sep 30 04:23:28 crc kubenswrapper[4744]: I0930 04:23:28.836291 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l7hzl/must-gather-gpgls"] Sep 30 04:23:28 crc kubenswrapper[4744]: I0930 04:23:28.836805 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-l7hzl/must-gather-gpgls" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="copy" containerID="cri-o://cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400" 
gracePeriod=2 Sep 30 04:23:28 crc kubenswrapper[4744]: I0930 04:23:28.848811 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l7hzl/must-gather-gpgls"] Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.280558 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l7hzl_must-gather-gpgls_cc11d67a-25a6-4242-ab75-e6820419a269/copy/0.log" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.281131 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.316837 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc11d67a-25a6-4242-ab75-e6820419a269-must-gather-output\") pod \"cc11d67a-25a6-4242-ab75-e6820419a269\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.316918 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm4k8\" (UniqueName: \"kubernetes.io/projected/cc11d67a-25a6-4242-ab75-e6820419a269-kube-api-access-rm4k8\") pod \"cc11d67a-25a6-4242-ab75-e6820419a269\" (UID: \"cc11d67a-25a6-4242-ab75-e6820419a269\") " Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.322035 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc11d67a-25a6-4242-ab75-e6820419a269-kube-api-access-rm4k8" (OuterVolumeSpecName: "kube-api-access-rm4k8") pod "cc11d67a-25a6-4242-ab75-e6820419a269" (UID: "cc11d67a-25a6-4242-ab75-e6820419a269"). InnerVolumeSpecName "kube-api-access-rm4k8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.398534 4744 generic.go:334] "Generic (PLEG): container finished" podID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerID="814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652" exitCode=0 Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.398581 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerDied","Data":"814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652"} Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.399993 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l7hzl_must-gather-gpgls_cc11d67a-25a6-4242-ab75-e6820419a269/copy/0.log" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.401125 4744 generic.go:334] "Generic (PLEG): container finished" podID="cc11d67a-25a6-4242-ab75-e6820419a269" containerID="cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400" exitCode=143 Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.401193 4744 scope.go:117] "RemoveContainer" containerID="cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.401293 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l7hzl/must-gather-gpgls" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.420013 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm4k8\" (UniqueName: \"kubernetes.io/projected/cc11d67a-25a6-4242-ab75-e6820419a269-kube-api-access-rm4k8\") on node \"crc\" DevicePath \"\"" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.423037 4744 scope.go:117] "RemoveContainer" containerID="b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.512693 4744 scope.go:117] "RemoveContainer" containerID="cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400" Sep 30 04:23:29 crc kubenswrapper[4744]: E0930 04:23:29.514922 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400\": container with ID starting with cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400 not found: ID does not exist" containerID="cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.514967 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400"} err="failed to get container status \"cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400\": rpc error: code = NotFound desc = could not find container \"cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400\": container with ID starting with cea5faf4716152199e52740c484ee42bcdc35a47c4485cb5328aab8d7dfd3400 not found: ID does not exist" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.514995 4744 scope.go:117] "RemoveContainer" containerID="b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18" Sep 30 04:23:29 crc kubenswrapper[4744]: 
E0930 04:23:29.515290 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18\": container with ID starting with b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18 not found: ID does not exist" containerID="b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.515331 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18"} err="failed to get container status \"b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18\": rpc error: code = NotFound desc = could not find container \"b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18\": container with ID starting with b1abfd6bab2fbbfdf07f1825f911ba17f11fe7bd392bc0614c8b881fd2914e18 not found: ID does not exist" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.525114 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc11d67a-25a6-4242-ab75-e6820419a269-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cc11d67a-25a6-4242-ab75-e6820419a269" (UID: "cc11d67a-25a6-4242-ab75-e6820419a269"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:23:29 crc kubenswrapper[4744]: I0930 04:23:29.623577 4744 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cc11d67a-25a6-4242-ab75-e6820419a269-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 04:23:30 crc kubenswrapper[4744]: I0930 04:23:30.412251 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerStarted","Data":"8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa"} Sep 30 04:23:30 crc kubenswrapper[4744]: I0930 04:23:30.444960 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hjrrk" podStartSLOduration=3.002651309 podStartE2EDuration="5.44494228s" podCreationTimestamp="2025-09-30 04:23:25 +0000 UTC" firstStartedPulling="2025-09-30 04:23:27.38156759 +0000 UTC m=+5334.554787564" lastFinishedPulling="2025-09-30 04:23:29.823858541 +0000 UTC m=+5336.997078535" observedRunningTime="2025-09-30 04:23:30.438388485 +0000 UTC m=+5337.611608469" watchObservedRunningTime="2025-09-30 04:23:30.44494228 +0000 UTC m=+5337.618162254" Sep 30 04:23:31 crc kubenswrapper[4744]: I0930 04:23:31.525208 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" path="/var/lib/kubelet/pods/cc11d67a-25a6-4242-ab75-e6820419a269/volumes" Sep 30 04:23:36 crc kubenswrapper[4744]: I0930 04:23:36.248526 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:36 crc kubenswrapper[4744]: I0930 04:23:36.248973 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:36 crc kubenswrapper[4744]: I0930 04:23:36.339220 4744 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:36 crc kubenswrapper[4744]: I0930 04:23:36.565460 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:36 crc kubenswrapper[4744]: I0930 04:23:36.634169 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjrrk"] Sep 30 04:23:38 crc kubenswrapper[4744]: I0930 04:23:38.507362 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hjrrk" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="registry-server" containerID="cri-o://8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa" gracePeriod=2 Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.062340 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.179905 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lncqr\" (UniqueName: \"kubernetes.io/projected/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-kube-api-access-lncqr\") pod \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.180011 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-catalog-content\") pod \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.180076 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-utilities\") pod \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\" (UID: \"4106a90e-f1d8-48bc-9902-8e363d0aaa7b\") " Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.181612 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-utilities" (OuterVolumeSpecName: "utilities") pod "4106a90e-f1d8-48bc-9902-8e363d0aaa7b" (UID: "4106a90e-f1d8-48bc-9902-8e363d0aaa7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.185909 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-kube-api-access-lncqr" (OuterVolumeSpecName: "kube-api-access-lncqr") pod "4106a90e-f1d8-48bc-9902-8e363d0aaa7b" (UID: "4106a90e-f1d8-48bc-9902-8e363d0aaa7b"). InnerVolumeSpecName "kube-api-access-lncqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.233009 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4106a90e-f1d8-48bc-9902-8e363d0aaa7b" (UID: "4106a90e-f1d8-48bc-9902-8e363d0aaa7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.282728 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.282760 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.282770 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lncqr\" (UniqueName: \"kubernetes.io/projected/4106a90e-f1d8-48bc-9902-8e363d0aaa7b-kube-api-access-lncqr\") on node \"crc\" DevicePath \"\"" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.524147 4744 generic.go:334] "Generic (PLEG): container finished" podID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerID="8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa" exitCode=0 Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.524213 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerDied","Data":"8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa"} Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.524247 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjrrk" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.524268 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjrrk" event={"ID":"4106a90e-f1d8-48bc-9902-8e363d0aaa7b","Type":"ContainerDied","Data":"324873d4f10a9dc0c15575166c1975ea5ad5f33ef3a036adc55d3d4bbf38a491"} Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.524301 4744 scope.go:117] "RemoveContainer" containerID="8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.561185 4744 scope.go:117] "RemoveContainer" containerID="814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.591016 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjrrk"] Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.600361 4744 scope.go:117] "RemoveContainer" containerID="d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.603347 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hjrrk"] Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.668041 4744 scope.go:117] "RemoveContainer" containerID="8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa" Sep 30 04:23:39 crc kubenswrapper[4744]: E0930 04:23:39.668681 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa\": container with ID starting with 8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa not found: ID does not exist" containerID="8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.668747 4744 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa"} err="failed to get container status \"8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa\": rpc error: code = NotFound desc = could not find container \"8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa\": container with ID starting with 8b5f4db78cf6bbb3d576e56b6853e1dba6c793030ca6e8ae3b6cda9134c19caa not found: ID does not exist" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.668787 4744 scope.go:117] "RemoveContainer" containerID="814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652" Sep 30 04:23:39 crc kubenswrapper[4744]: E0930 04:23:39.669505 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652\": container with ID starting with 814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652 not found: ID does not exist" containerID="814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.669548 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652"} err="failed to get container status \"814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652\": rpc error: code = NotFound desc = could not find container \"814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652\": container with ID starting with 814c511aca60b46e204a8becbafce76290c619b804bd952d1a22d66ff4c2d652 not found: ID does not exist" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.669635 4744 scope.go:117] "RemoveContainer" containerID="d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607" Sep 30 04:23:39 crc kubenswrapper[4744]: E0930 
04:23:39.670071 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607\": container with ID starting with d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607 not found: ID does not exist" containerID="d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607" Sep 30 04:23:39 crc kubenswrapper[4744]: I0930 04:23:39.670133 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607"} err="failed to get container status \"d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607\": rpc error: code = NotFound desc = could not find container \"d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607\": container with ID starting with d56a40debd3872ee79be4209e1a9e62f46157e06280bf8370b3273405bcbf607 not found: ID does not exist" Sep 30 04:23:41 crc kubenswrapper[4744]: I0930 04:23:41.521128 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" path="/var/lib/kubelet/pods/4106a90e-f1d8-48bc-9902-8e363d0aaa7b/volumes" Sep 30 04:23:43 crc kubenswrapper[4744]: I0930 04:23:43.413868 4744 scope.go:117] "RemoveContainer" containerID="899a3a4e1b091e49790cd500b6ec90a076752cf4c589bd8b4ee2232df6911063" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.776557 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l8km5/must-gather-b5qvd"] Sep 30 04:24:13 crc kubenswrapper[4744]: E0930 04:24:13.777750 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="extract-content" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.777772 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" 
containerName="extract-content" Sep 30 04:24:13 crc kubenswrapper[4744]: E0930 04:24:13.777789 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="copy" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.777802 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="copy" Sep 30 04:24:13 crc kubenswrapper[4744]: E0930 04:24:13.777852 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="extract-utilities" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.777864 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="extract-utilities" Sep 30 04:24:13 crc kubenswrapper[4744]: E0930 04:24:13.777889 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="registry-server" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.777899 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="registry-server" Sep 30 04:24:13 crc kubenswrapper[4744]: E0930 04:24:13.777921 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="gather" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.777931 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="gather" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.778230 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="copy" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.778270 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="4106a90e-f1d8-48bc-9902-8e363d0aaa7b" containerName="registry-server" Sep 30 04:24:13 crc 
kubenswrapper[4744]: I0930 04:24:13.778304 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc11d67a-25a6-4242-ab75-e6820419a269" containerName="gather" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.780697 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.782634 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-l8km5"/"default-dockercfg-fq7p2" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.782790 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l8km5"/"openshift-service-ca.crt" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.782856 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-l8km5"/"kube-root-ca.crt" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.803869 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l8km5/must-gather-b5qvd"] Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.983449 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c175e32d-a72c-42cd-979b-9ced3a98d7f8-must-gather-output\") pod \"must-gather-b5qvd\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:13 crc kubenswrapper[4744]: I0930 04:24:13.983979 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f996\" (UniqueName: \"kubernetes.io/projected/c175e32d-a72c-42cd-979b-9ced3a98d7f8-kube-api-access-5f996\") pod \"must-gather-b5qvd\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.086149 4744 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c175e32d-a72c-42cd-979b-9ced3a98d7f8-must-gather-output\") pod \"must-gather-b5qvd\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.086279 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f996\" (UniqueName: \"kubernetes.io/projected/c175e32d-a72c-42cd-979b-9ced3a98d7f8-kube-api-access-5f996\") pod \"must-gather-b5qvd\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.086634 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c175e32d-a72c-42cd-979b-9ced3a98d7f8-must-gather-output\") pod \"must-gather-b5qvd\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.104927 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f996\" (UniqueName: \"kubernetes.io/projected/c175e32d-a72c-42cd-979b-9ced3a98d7f8-kube-api-access-5f996\") pod \"must-gather-b5qvd\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.111105 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.531612 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l8km5/must-gather-b5qvd"] Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.943807 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/must-gather-b5qvd" event={"ID":"c175e32d-a72c-42cd-979b-9ced3a98d7f8","Type":"ContainerStarted","Data":"af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5"} Sep 30 04:24:14 crc kubenswrapper[4744]: I0930 04:24:14.944075 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/must-gather-b5qvd" event={"ID":"c175e32d-a72c-42cd-979b-9ced3a98d7f8","Type":"ContainerStarted","Data":"017bb5bc2857b880b3c3f0912bb6589432eb847247d3c08da714cb97d2896d9b"} Sep 30 04:24:15 crc kubenswrapper[4744]: I0930 04:24:15.959274 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/must-gather-b5qvd" event={"ID":"c175e32d-a72c-42cd-979b-9ced3a98d7f8","Type":"ContainerStarted","Data":"1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023"} Sep 30 04:24:15 crc kubenswrapper[4744]: I0930 04:24:15.971852 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l8km5/must-gather-b5qvd" podStartSLOduration=2.971837645 podStartE2EDuration="2.971837645s" podCreationTimestamp="2025-09-30 04:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 04:24:15.969805511 +0000 UTC m=+5383.143025485" watchObservedRunningTime="2025-09-30 04:24:15.971837645 +0000 UTC m=+5383.145057619" Sep 30 04:24:16 crc kubenswrapper[4744]: E0930 04:24:16.941557 4744 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:47750->38.102.83.51:40449: write tcp 
38.102.83.51:47750->38.102.83.51:40449: write: broken pipe Sep 30 04:24:18 crc kubenswrapper[4744]: I0930 04:24:18.840552 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l8km5/crc-debug-hxhhr"] Sep 30 04:24:18 crc kubenswrapper[4744]: I0930 04:24:18.843995 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:18 crc kubenswrapper[4744]: I0930 04:24:18.971263 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/209bc69c-4023-41c6-8c8e-f9582de3cc0a-host\") pod \"crc-debug-hxhhr\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:18 crc kubenswrapper[4744]: I0930 04:24:18.971348 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zljxd\" (UniqueName: \"kubernetes.io/projected/209bc69c-4023-41c6-8c8e-f9582de3cc0a-kube-api-access-zljxd\") pod \"crc-debug-hxhhr\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:19 crc kubenswrapper[4744]: I0930 04:24:19.073098 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/209bc69c-4023-41c6-8c8e-f9582de3cc0a-host\") pod \"crc-debug-hxhhr\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:19 crc kubenswrapper[4744]: I0930 04:24:19.073174 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zljxd\" (UniqueName: \"kubernetes.io/projected/209bc69c-4023-41c6-8c8e-f9582de3cc0a-kube-api-access-zljxd\") pod \"crc-debug-hxhhr\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:19 crc 
kubenswrapper[4744]: I0930 04:24:19.073555 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/209bc69c-4023-41c6-8c8e-f9582de3cc0a-host\") pod \"crc-debug-hxhhr\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:19 crc kubenswrapper[4744]: I0930 04:24:19.090627 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zljxd\" (UniqueName: \"kubernetes.io/projected/209bc69c-4023-41c6-8c8e-f9582de3cc0a-kube-api-access-zljxd\") pod \"crc-debug-hxhhr\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:19 crc kubenswrapper[4744]: I0930 04:24:19.160500 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:24:19 crc kubenswrapper[4744]: I0930 04:24:19.996612 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" event={"ID":"209bc69c-4023-41c6-8c8e-f9582de3cc0a","Type":"ContainerStarted","Data":"33b3188c662cd94b82dd12358069eb704a3929b44b14e33a12093024f3de486d"} Sep 30 04:24:19 crc kubenswrapper[4744]: I0930 04:24:19.997058 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" event={"ID":"209bc69c-4023-41c6-8c8e-f9582de3cc0a","Type":"ContainerStarted","Data":"7d5bdfc64dfd16da760b3be4fde065153c553df6ad17a88dc5f9cbb33683b7c7"} Sep 30 04:24:20 crc kubenswrapper[4744]: I0930 04:24:20.010859 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" podStartSLOduration=2.010834098 podStartE2EDuration="2.010834098s" podCreationTimestamp="2025-09-30 04:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
04:24:20.010161688 +0000 UTC m=+5387.183381662" watchObservedRunningTime="2025-09-30 04:24:20.010834098 +0000 UTC m=+5387.184054072" Sep 30 04:24:34 crc kubenswrapper[4744]: I0930 04:24:34.348290 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:24:34 crc kubenswrapper[4744]: I0930 04:24:34.350119 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:25:04 crc kubenswrapper[4744]: I0930 04:25:04.348003 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:25:04 crc kubenswrapper[4744]: I0930 04:25:04.348580 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:25:27 crc kubenswrapper[4744]: I0930 04:25:27.139651 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b554c468b-9gtqj_fbf5b5e2-d32c-4714-864b-06e2f15dd3ce/barbican-api/0.log" Sep 30 04:25:27 crc kubenswrapper[4744]: I0930 04:25:27.167277 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-6b554c468b-9gtqj_fbf5b5e2-d32c-4714-864b-06e2f15dd3ce/barbican-api-log/0.log" Sep 30 04:25:27 crc kubenswrapper[4744]: I0930 04:25:27.364322 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-578ccf57db-dnd4k_89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e/barbican-keystone-listener/0.log" Sep 30 04:25:27 crc kubenswrapper[4744]: I0930 04:25:27.646721 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58944b8f99-bl9hx_fb5969c0-4230-4813-9009-546eda8657eb/barbican-worker/0.log" Sep 30 04:25:27 crc kubenswrapper[4744]: I0930 04:25:27.808559 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58944b8f99-bl9hx_fb5969c0-4230-4813-9009-546eda8657eb/barbican-worker-log/0.log" Sep 30 04:25:28 crc kubenswrapper[4744]: I0930 04:25:28.087498 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5gcxq_2abd1aec-872e-4bcb-a05f-c0d04d689489/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:28 crc kubenswrapper[4744]: I0930 04:25:28.175655 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-578ccf57db-dnd4k_89d0d6d3-9ea4-4d48-b7e9-3dfcfc4ba56e/barbican-keystone-listener-log/0.log" Sep 30 04:25:28 crc kubenswrapper[4744]: I0930 04:25:28.575630 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/ceilometer-central-agent/0.log" Sep 30 04:25:28 crc kubenswrapper[4744]: I0930 04:25:28.599597 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/proxy-httpd/0.log" Sep 30 04:25:28 crc kubenswrapper[4744]: I0930 04:25:28.629376 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/ceilometer-notification-agent/0.log" Sep 30 04:25:28 crc kubenswrapper[4744]: I0930 04:25:28.818754 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea2be999-3323-4e60-b44e-641418e67b04/sg-core/0.log" Sep 30 04:25:29 crc kubenswrapper[4744]: I0930 04:25:29.025733 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_800e9149-6d7e-4196-bad2-e747131c3e34/ceph/0.log" Sep 30 04:25:29 crc kubenswrapper[4744]: I0930 04:25:29.273063 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5269829d-b1f7-4980-9550-d622fa40c1f1/cinder-api/0.log" Sep 30 04:25:29 crc kubenswrapper[4744]: I0930 04:25:29.416354 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5269829d-b1f7-4980-9550-d622fa40c1f1/cinder-api-log/0.log" Sep 30 04:25:29 crc kubenswrapper[4744]: I0930 04:25:29.671382 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6cc7863-b10e-47a4-bd86-5c66436d4af4/probe/0.log" Sep 30 04:25:29 crc kubenswrapper[4744]: I0930 04:25:29.929855 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1ba32e4a-2e93-4483-9acf-a7a72792b0f6/cinder-scheduler/0.log" Sep 30 04:25:29 crc kubenswrapper[4744]: I0930 04:25:29.969127 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1ba32e4a-2e93-4483-9acf-a7a72792b0f6/probe/0.log" Sep 30 04:25:30 crc kubenswrapper[4744]: I0930 04:25:30.469348 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e6cc7863-b10e-47a4-bd86-5c66436d4af4/cinder-backup/0.log" Sep 30 04:25:30 crc kubenswrapper[4744]: I0930 04:25:30.479710 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_597b8dc3-9c8f-48c4-b554-7d8564395142/probe/0.log" Sep 30 04:25:30 crc 
kubenswrapper[4744]: I0930 04:25:30.679489 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-zjt8n_bb8619b8-471f-4b9c-a9ee-97f668713bec/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:30 crc kubenswrapper[4744]: I0930 04:25:30.722069 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pp49d_3c88be7c-d782-4d4c-9110-997c89d8261e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:30 crc kubenswrapper[4744]: I0930 04:25:30.882899 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-2fpz5_166b326f-c29c-48e9-b017-034c02b4d448/init/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 04:25:31.096061 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-2fpz5_166b326f-c29c-48e9-b017-034c02b4d448/init/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 04:25:31.326185 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-2fpz5_166b326f-c29c-48e9-b017-034c02b4d448/dnsmasq-dns/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 04:25:31.408296 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8qnx2_1ec7b740-1236-48b8-9aa5-0fd0c2f64380/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 04:25:31.598947 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6a6b749-14c4-4726-b176-160667e2651d/glance-log/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 04:25:31.629856 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6a6b749-14c4-4726-b176-160667e2651d/glance-httpd/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 
04:25:31.824116 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa/glance-httpd/0.log" Sep 30 04:25:31 crc kubenswrapper[4744]: I0930 04:25:31.833270 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8e1e9f2b-bdae-4a3c-9f05-b8a86d90acaa/glance-log/0.log" Sep 30 04:25:32 crc kubenswrapper[4744]: I0930 04:25:32.191967 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78db449746-kg7zl_ff31735f-472e-4b3a-8d81-bc5c392aec09/horizon/0.log" Sep 30 04:25:32 crc kubenswrapper[4744]: I0930 04:25:32.347727 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-4thcz_de456177-d85a-41d5-aa9f-f7d7d6f68e21/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:32 crc kubenswrapper[4744]: I0930 04:25:32.544305 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-frfr6_8a000c1d-f61a-4bb0-8041-acf07914d4de/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:32 crc kubenswrapper[4744]: I0930 04:25:32.884069 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_597b8dc3-9c8f-48c4-b554-7d8564395142/cinder-volume/0.log" Sep 30 04:25:32 crc kubenswrapper[4744]: I0930 04:25:32.925459 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78db449746-kg7zl_ff31735f-472e-4b3a-8d81-bc5c392aec09/horizon-log/0.log" Sep 30 04:25:33 crc kubenswrapper[4744]: I0930 04:25:33.056614 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320081-57dhj_6dcc7d67-05ee-4d4d-b8ef-b869fb38fbec/keystone-cron/0.log" Sep 30 04:25:33 crc kubenswrapper[4744]: I0930 04:25:33.121362 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_825e7c08-c607-429e-bf96-d8c332d03cd1/kube-state-metrics/0.log" Sep 30 04:25:33 crc kubenswrapper[4744]: I0930 04:25:33.381317 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mfdfh_fc1867d3-bb6f-4fca-876d-b868bcd284bb/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.269881 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a1d320da-1463-4d51-beff-da49872cdb35/manila-api/0.log" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.319763 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_34aed00c-8bca-400a-bea5-1e7966a35388/manila-scheduler/0.log" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.347244 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.347302 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.347349 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.348172 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"daf39e736917ca0a1b7fe85b78046b281e6dcd02d5f095b3a5da30b062ada306"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.348233 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://daf39e736917ca0a1b7fe85b78046b281e6dcd02d5f095b3a5da30b062ada306" gracePeriod=600 Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.551339 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_34aed00c-8bca-400a-bea5-1e7966a35388/probe/0.log" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.731244 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a1d320da-1463-4d51-beff-da49872cdb35/manila-api-log/0.log" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.755068 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="daf39e736917ca0a1b7fe85b78046b281e6dcd02d5f095b3a5da30b062ada306" exitCode=0 Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.755174 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"daf39e736917ca0a1b7fe85b78046b281e6dcd02d5f095b3a5da30b062ada306"} Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.756089 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerStarted","Data":"3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15"} Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.756911 4744 scope.go:117] "RemoveContainer" containerID="760cdb206bb2ca77b24d0fc49d10f9607d1163f6aaba02618c3b022fa40b90f1" Sep 30 04:25:34 crc kubenswrapper[4744]: I0930 04:25:34.863759 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8f4d1853-2bfc-4470-be87-65c81ff45b97/probe/0.log" Sep 30 04:25:35 crc kubenswrapper[4744]: I0930 04:25:35.204529 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_8f4d1853-2bfc-4470-be87-65c81ff45b97/manila-share/0.log" Sep 30 04:25:35 crc kubenswrapper[4744]: I0930 04:25:35.740739 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-59ddc4db88-d9q99_02356cb4-2497-483a-9742-acd6b9080dc2/keystone-api/0.log" Sep 30 04:25:36 crc kubenswrapper[4744]: I0930 04:25:36.110767 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5578f9874f-7lb9c_ecbf3c72-f1cb-48fd-8823-3d3ae2040c86/neutron-httpd/0.log" Sep 30 04:25:36 crc kubenswrapper[4744]: I0930 04:25:36.296993 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bb5rb_ced06625-11b0-4e49-9874-9f627107037c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:36 crc kubenswrapper[4744]: I0930 04:25:36.556700 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5578f9874f-7lb9c_ecbf3c72-f1cb-48fd-8823-3d3ae2040c86/neutron-api/0.log" Sep 30 04:25:37 crc kubenswrapper[4744]: I0930 04:25:37.583069 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9d37d81d-59fb-4686-b8b9-34ba95b98cb2/nova-cell0-conductor-conductor/0.log" Sep 30 04:25:38 crc kubenswrapper[4744]: I0930 
04:25:38.233051 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_45c58372-9d54-41ad-8059-5666ff3ab3c6/nova-cell1-conductor-conductor/0.log" Sep 30 04:25:38 crc kubenswrapper[4744]: I0930 04:25:38.671830 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5a7ff737-dbb5-4e5c-9862-6b99f8584fc4/nova-api-log/0.log" Sep 30 04:25:38 crc kubenswrapper[4744]: I0930 04:25:38.870941 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_843d7ca4-8741-4c46-9e24-c432261d5c57/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 04:25:39 crc kubenswrapper[4744]: I0930 04:25:39.164354 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qj5zd_a0abbf7a-4bd2-4a60-a571-68eae4ea321c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:39 crc kubenswrapper[4744]: I0930 04:25:39.389613 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9aebe30-d132-461b-ad9b-fa6bc9f1227b/nova-metadata-log/0.log" Sep 30 04:25:39 crc kubenswrapper[4744]: I0930 04:25:39.526051 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5a7ff737-dbb5-4e5c-9862-6b99f8584fc4/nova-api-api/0.log" Sep 30 04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.029065 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c1131b4e-532d-478b-bbd8-b52963f60462/mysql-bootstrap/0.log" Sep 30 04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.268824 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c1131b4e-532d-478b-bbd8-b52963f60462/mysql-bootstrap/0.log" Sep 30 04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.282422 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_897dfc3d-b2fa-4a22-b5a9-e2ce2c486801/nova-scheduler-scheduler/0.log" Sep 30 
04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.466968 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c1131b4e-532d-478b-bbd8-b52963f60462/galera/0.log" Sep 30 04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.657601 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ddf3db46-b4d2-469a-bc2e-dc5610bb2807/mysql-bootstrap/0.log" Sep 30 04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.870909 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ddf3db46-b4d2-469a-bc2e-dc5610bb2807/mysql-bootstrap/0.log" Sep 30 04:25:40 crc kubenswrapper[4744]: I0930 04:25:40.897983 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ddf3db46-b4d2-469a-bc2e-dc5610bb2807/galera/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: I0930 04:25:41.052963 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_952fb37a-2fb5-41f5-a9f6-195c94862274/openstackclient/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: I0930 04:25:41.291479 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m95jr_6aa7757e-eced-4195-8b1d-88fd7a3b322d/ovn-controller/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: I0930 04:25:41.518693 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-227s2_dfc48401-bc82-4227-a5f2-22b7b5699433/openstack-network-exporter/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: I0930 04:25:41.575630 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d9aebe30-d132-461b-ad9b-fa6bc9f1227b/nova-metadata-metadata/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: I0930 04:25:41.717874 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovsdb-server-init/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: 
I0930 04:25:41.900561 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovs-vswitchd/0.log" Sep 30 04:25:41 crc kubenswrapper[4744]: I0930 04:25:41.914718 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovsdb-server-init/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.024653 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-t9l7c_f8f29c8c-e61d-4ec5-8a7c-c3c9079bb064/ovsdb-server/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.096650 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-vtk6h_4f7d5328-af2c-4ce3-8b2a-0fa4a6a94225/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.314889 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_09a4d14b-16a0-442c-8444-af404618ae96/openstack-network-exporter/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.515205 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_09a4d14b-16a0-442c-8444-af404618ae96/ovn-northd/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.553191 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7bfc1c21-6422-4308-8370-2dd0b26a3c1e/openstack-network-exporter/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.689280 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7bfc1c21-6422-4308-8370-2dd0b26a3c1e/ovsdbserver-nb/0.log" Sep 30 04:25:42 crc kubenswrapper[4744]: I0930 04:25:42.745599 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e0e55f0-f333-4bc6-9905-18adf601fb9c/openstack-network-exporter/0.log" Sep 30 04:25:42 crc 
kubenswrapper[4744]: I0930 04:25:42.878607 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1e0e55f0-f333-4bc6-9905-18adf601fb9c/ovsdbserver-sb/0.log" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.281979 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ddc58d856-kwfp8_825a9fa1-9368-48f2-9baa-1b8390d0cd3a/placement-api/0.log" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.419002 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_341a2cff-5aae-4952-a8d8-64d5e247d7f9/setup-container/0.log" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.579703 4744 scope.go:117] "RemoveContainer" containerID="223b746d6eaba590555501ad42b5de497c131a638771d65a4f7cad40406524e8" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.638508 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ddc58d856-kwfp8_825a9fa1-9368-48f2-9baa-1b8390d0cd3a/placement-log/0.log" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.741188 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_341a2cff-5aae-4952-a8d8-64d5e247d7f9/rabbitmq/0.log" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.747710 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_341a2cff-5aae-4952-a8d8-64d5e247d7f9/setup-container/0.log" Sep 30 04:25:43 crc kubenswrapper[4744]: I0930 04:25:43.974680 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7d180fc4-3fb0-4db5-99d7-913559d8ec2e/setup-container/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.173282 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7d180fc4-3fb0-4db5-99d7-913559d8ec2e/setup-container/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.178291 4744 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_7d180fc4-3fb0-4db5-99d7-913559d8ec2e/rabbitmq/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.371503 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9vdbn_3e4f8446-ac54-4cff-b7f3-025ced28cc74/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.409607 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-299d4_50f909f5-fbe0-489d-bb41-59a3318cd416/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.652106 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tbj8m_1b061989-0be6-4c0d-800f-05bedb5c9a90/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.853631 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-q66n7_13c93dcf-8343-45ef-a4cf-3f411d5311e1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.942689 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4lhn_3aa13f38-7b2d-4f65-8ca1-0de736d1f291/ssh-known-hosts-edpm-deployment/0.log" Sep 30 04:25:44 crc kubenswrapper[4744]: I0930 04:25:44.995716 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0eb33cf6-e46d-4f10-b794-6707d21fc4ab/memcached/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.146980 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64cfcf86c-tq8s6_2a42e069-1859-4077-8f50-8b285465b47a/proxy-server/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.208849 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-64cfcf86c-tq8s6_2a42e069-1859-4077-8f50-8b285465b47a/proxy-httpd/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.368285 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qcpwz_35a9b94f-3d1b-40a3-9bcf-279d796e86d9/swift-ring-rebalance/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.378359 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-auditor/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.549944 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-reaper/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.686195 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-server/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.718396 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/account-replicator/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.734660 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-auditor/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.761919 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-replicator/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.855802 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-server/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.879359 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/container-updater/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.926435 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-auditor/0.log" Sep 30 04:25:45 crc kubenswrapper[4744]: I0930 04:25:45.968696 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-expirer/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.064399 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-replicator/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.077917 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-server/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.142806 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/rsync/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.156063 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/object-updater/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.271839 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6391d7af-84fd-42ee-ac86-399fa13725de/swift-recon-cron/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.386403 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-rkxts_cb2e96d3-3ed8-4de1-bb0c-9a0c11df2f0c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.495391 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_f4a78f7a-b5bc-4636-81df-578f5105bce3/tempest-tests-tempest-tests-runner/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.582897 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6fff3c05-b002-4910-8909-665295c5d940/test-operator-logs-container/0.log" Sep 30 04:25:46 crc kubenswrapper[4744]: I0930 04:25:46.673879 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-rdrfb_287b3de6-0593-428e-80d8-b70b360a7d41/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 04:26:12 crc kubenswrapper[4744]: I0930 04:26:12.083010 4744 generic.go:334] "Generic (PLEG): container finished" podID="209bc69c-4023-41c6-8c8e-f9582de3cc0a" containerID="33b3188c662cd94b82dd12358069eb704a3929b44b14e33a12093024f3de486d" exitCode=0 Sep 30 04:26:12 crc kubenswrapper[4744]: I0930 04:26:12.083062 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" event={"ID":"209bc69c-4023-41c6-8c8e-f9582de3cc0a","Type":"ContainerDied","Data":"33b3188c662cd94b82dd12358069eb704a3929b44b14e33a12093024f3de486d"} Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.242859 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.294077 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l8km5/crc-debug-hxhhr"] Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.304006 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l8km5/crc-debug-hxhhr"] Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.433393 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/209bc69c-4023-41c6-8c8e-f9582de3cc0a-host\") pod \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.433565 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/209bc69c-4023-41c6-8c8e-f9582de3cc0a-host" (OuterVolumeSpecName: "host") pod "209bc69c-4023-41c6-8c8e-f9582de3cc0a" (UID: "209bc69c-4023-41c6-8c8e-f9582de3cc0a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.433946 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zljxd\" (UniqueName: \"kubernetes.io/projected/209bc69c-4023-41c6-8c8e-f9582de3cc0a-kube-api-access-zljxd\") pod \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\" (UID: \"209bc69c-4023-41c6-8c8e-f9582de3cc0a\") " Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.434671 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/209bc69c-4023-41c6-8c8e-f9582de3cc0a-host\") on node \"crc\" DevicePath \"\"" Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.456187 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209bc69c-4023-41c6-8c8e-f9582de3cc0a-kube-api-access-zljxd" (OuterVolumeSpecName: "kube-api-access-zljxd") pod "209bc69c-4023-41c6-8c8e-f9582de3cc0a" (UID: "209bc69c-4023-41c6-8c8e-f9582de3cc0a"). InnerVolumeSpecName "kube-api-access-zljxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.522800 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209bc69c-4023-41c6-8c8e-f9582de3cc0a" path="/var/lib/kubelet/pods/209bc69c-4023-41c6-8c8e-f9582de3cc0a/volumes" Sep 30 04:26:13 crc kubenswrapper[4744]: I0930 04:26:13.536796 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zljxd\" (UniqueName: \"kubernetes.io/projected/209bc69c-4023-41c6-8c8e-f9582de3cc0a-kube-api-access-zljxd\") on node \"crc\" DevicePath \"\"" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.116074 4744 scope.go:117] "RemoveContainer" containerID="33b3188c662cd94b82dd12358069eb704a3929b44b14e33a12093024f3de486d" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.116115 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-hxhhr" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.499228 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l8km5/crc-debug-xntgt"] Sep 30 04:26:14 crc kubenswrapper[4744]: E0930 04:26:14.499843 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209bc69c-4023-41c6-8c8e-f9582de3cc0a" containerName="container-00" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.500169 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="209bc69c-4023-41c6-8c8e-f9582de3cc0a" containerName="container-00" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.500488 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="209bc69c-4023-41c6-8c8e-f9582de3cc0a" containerName="container-00" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.501345 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.668785 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56m2\" (UniqueName: \"kubernetes.io/projected/3d18ee49-1766-4f67-a0e1-e7b20da680ea-kube-api-access-w56m2\") pod \"crc-debug-xntgt\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.669020 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d18ee49-1766-4f67-a0e1-e7b20da680ea-host\") pod \"crc-debug-xntgt\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.771111 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/3d18ee49-1766-4f67-a0e1-e7b20da680ea-host\") pod \"crc-debug-xntgt\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.771265 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d18ee49-1766-4f67-a0e1-e7b20da680ea-host\") pod \"crc-debug-xntgt\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.771583 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56m2\" (UniqueName: \"kubernetes.io/projected/3d18ee49-1766-4f67-a0e1-e7b20da680ea-kube-api-access-w56m2\") pod \"crc-debug-xntgt\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.806197 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56m2\" (UniqueName: \"kubernetes.io/projected/3d18ee49-1766-4f67-a0e1-e7b20da680ea-kube-api-access-w56m2\") pod \"crc-debug-xntgt\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: I0930 04:26:14.822087 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:14 crc kubenswrapper[4744]: W0930 04:26:14.885554 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d18ee49_1766_4f67_a0e1_e7b20da680ea.slice/crio-0d34777eb62bfce2899ab0ccf6d885fbb8fc2433bc43918f03c4fb4b4c78f4ef WatchSource:0}: Error finding container 0d34777eb62bfce2899ab0ccf6d885fbb8fc2433bc43918f03c4fb4b4c78f4ef: Status 404 returned error can't find the container with id 0d34777eb62bfce2899ab0ccf6d885fbb8fc2433bc43918f03c4fb4b4c78f4ef Sep 30 04:26:15 crc kubenswrapper[4744]: I0930 04:26:15.132441 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-xntgt" event={"ID":"3d18ee49-1766-4f67-a0e1-e7b20da680ea","Type":"ContainerStarted","Data":"0d34777eb62bfce2899ab0ccf6d885fbb8fc2433bc43918f03c4fb4b4c78f4ef"} Sep 30 04:26:16 crc kubenswrapper[4744]: I0930 04:26:16.145674 4744 generic.go:334] "Generic (PLEG): container finished" podID="3d18ee49-1766-4f67-a0e1-e7b20da680ea" containerID="db0e27cf2a25bc778af6f1b1057dfe9b82e3fafb7d3b2af4f61039cdf9e8526d" exitCode=0 Sep 30 04:26:16 crc kubenswrapper[4744]: I0930 04:26:16.145716 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-xntgt" event={"ID":"3d18ee49-1766-4f67-a0e1-e7b20da680ea","Type":"ContainerDied","Data":"db0e27cf2a25bc778af6f1b1057dfe9b82e3fafb7d3b2af4f61039cdf9e8526d"} Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.266890 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.426045 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w56m2\" (UniqueName: \"kubernetes.io/projected/3d18ee49-1766-4f67-a0e1-e7b20da680ea-kube-api-access-w56m2\") pod \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.426380 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d18ee49-1766-4f67-a0e1-e7b20da680ea-host\") pod \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\" (UID: \"3d18ee49-1766-4f67-a0e1-e7b20da680ea\") " Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.426851 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d18ee49-1766-4f67-a0e1-e7b20da680ea-host" (OuterVolumeSpecName: "host") pod "3d18ee49-1766-4f67-a0e1-e7b20da680ea" (UID: "3d18ee49-1766-4f67-a0e1-e7b20da680ea"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.446663 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d18ee49-1766-4f67-a0e1-e7b20da680ea-kube-api-access-w56m2" (OuterVolumeSpecName: "kube-api-access-w56m2") pod "3d18ee49-1766-4f67-a0e1-e7b20da680ea" (UID: "3d18ee49-1766-4f67-a0e1-e7b20da680ea"). InnerVolumeSpecName "kube-api-access-w56m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.528259 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d18ee49-1766-4f67-a0e1-e7b20da680ea-host\") on node \"crc\" DevicePath \"\"" Sep 30 04:26:17 crc kubenswrapper[4744]: I0930 04:26:17.528393 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w56m2\" (UniqueName: \"kubernetes.io/projected/3d18ee49-1766-4f67-a0e1-e7b20da680ea-kube-api-access-w56m2\") on node \"crc\" DevicePath \"\"" Sep 30 04:26:18 crc kubenswrapper[4744]: I0930 04:26:18.163879 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-xntgt" Sep 30 04:26:18 crc kubenswrapper[4744]: I0930 04:26:18.163826 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-xntgt" event={"ID":"3d18ee49-1766-4f67-a0e1-e7b20da680ea","Type":"ContainerDied","Data":"0d34777eb62bfce2899ab0ccf6d885fbb8fc2433bc43918f03c4fb4b4c78f4ef"} Sep 30 04:26:18 crc kubenswrapper[4744]: I0930 04:26:18.164253 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d34777eb62bfce2899ab0ccf6d885fbb8fc2433bc43918f03c4fb4b4c78f4ef" Sep 30 04:26:25 crc kubenswrapper[4744]: I0930 04:26:25.342426 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l8km5/crc-debug-xntgt"] Sep 30 04:26:25 crc kubenswrapper[4744]: I0930 04:26:25.348553 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l8km5/crc-debug-xntgt"] Sep 30 04:26:25 crc kubenswrapper[4744]: I0930 04:26:25.516656 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d18ee49-1766-4f67-a0e1-e7b20da680ea" path="/var/lib/kubelet/pods/3d18ee49-1766-4f67-a0e1-e7b20da680ea/volumes" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.565655 4744 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-must-gather-l8km5/crc-debug-kt44l"] Sep 30 04:26:26 crc kubenswrapper[4744]: E0930 04:26:26.566287 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d18ee49-1766-4f67-a0e1-e7b20da680ea" containerName="container-00" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.566300 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d18ee49-1766-4f67-a0e1-e7b20da680ea" containerName="container-00" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.566527 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d18ee49-1766-4f67-a0e1-e7b20da680ea" containerName="container-00" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.567181 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.689645 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkkrn\" (UniqueName: \"kubernetes.io/projected/9bb90cec-fcf9-49ed-8229-7e263d421a33-kube-api-access-zkkrn\") pod \"crc-debug-kt44l\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.689725 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bb90cec-fcf9-49ed-8229-7e263d421a33-host\") pod \"crc-debug-kt44l\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.791583 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkkrn\" (UniqueName: \"kubernetes.io/projected/9bb90cec-fcf9-49ed-8229-7e263d421a33-kube-api-access-zkkrn\") pod \"crc-debug-kt44l\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " 
pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.791995 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bb90cec-fcf9-49ed-8229-7e263d421a33-host\") pod \"crc-debug-kt44l\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.792327 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bb90cec-fcf9-49ed-8229-7e263d421a33-host\") pod \"crc-debug-kt44l\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.822419 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkkrn\" (UniqueName: \"kubernetes.io/projected/9bb90cec-fcf9-49ed-8229-7e263d421a33-kube-api-access-zkkrn\") pod \"crc-debug-kt44l\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:26 crc kubenswrapper[4744]: I0930 04:26:26.902956 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:27 crc kubenswrapper[4744]: I0930 04:26:27.252729 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-kt44l" event={"ID":"9bb90cec-fcf9-49ed-8229-7e263d421a33","Type":"ContainerStarted","Data":"9f30ddf658f8b180a94645caab792bcbf85b6c2a84e4e0761f9815d64343f724"} Sep 30 04:26:27 crc kubenswrapper[4744]: I0930 04:26:27.253193 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-kt44l" event={"ID":"9bb90cec-fcf9-49ed-8229-7e263d421a33","Type":"ContainerStarted","Data":"70ba6d1e4c76c4574b26b4d1a3ee656dcc0da816bc9b24a7fd5358aa6d24424f"} Sep 30 04:26:28 crc kubenswrapper[4744]: I0930 04:26:28.263850 4744 generic.go:334] "Generic (PLEG): container finished" podID="9bb90cec-fcf9-49ed-8229-7e263d421a33" containerID="9f30ddf658f8b180a94645caab792bcbf85b6c2a84e4e0761f9815d64343f724" exitCode=0 Sep 30 04:26:28 crc kubenswrapper[4744]: I0930 04:26:28.263906 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/crc-debug-kt44l" event={"ID":"9bb90cec-fcf9-49ed-8229-7e263d421a33","Type":"ContainerDied","Data":"9f30ddf658f8b180a94645caab792bcbf85b6c2a84e4e0761f9815d64343f724"} Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.377289 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.419570 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l8km5/crc-debug-kt44l"] Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.430623 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l8km5/crc-debug-kt44l"] Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.455338 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bb90cec-fcf9-49ed-8229-7e263d421a33-host\") pod \"9bb90cec-fcf9-49ed-8229-7e263d421a33\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.455486 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bb90cec-fcf9-49ed-8229-7e263d421a33-host" (OuterVolumeSpecName: "host") pod "9bb90cec-fcf9-49ed-8229-7e263d421a33" (UID: "9bb90cec-fcf9-49ed-8229-7e263d421a33"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.455732 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkkrn\" (UniqueName: \"kubernetes.io/projected/9bb90cec-fcf9-49ed-8229-7e263d421a33-kube-api-access-zkkrn\") pod \"9bb90cec-fcf9-49ed-8229-7e263d421a33\" (UID: \"9bb90cec-fcf9-49ed-8229-7e263d421a33\") " Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.456827 4744 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9bb90cec-fcf9-49ed-8229-7e263d421a33-host\") on node \"crc\" DevicePath \"\"" Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.463122 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb90cec-fcf9-49ed-8229-7e263d421a33-kube-api-access-zkkrn" (OuterVolumeSpecName: "kube-api-access-zkkrn") pod "9bb90cec-fcf9-49ed-8229-7e263d421a33" (UID: "9bb90cec-fcf9-49ed-8229-7e263d421a33"). InnerVolumeSpecName "kube-api-access-zkkrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.522336 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb90cec-fcf9-49ed-8229-7e263d421a33" path="/var/lib/kubelet/pods/9bb90cec-fcf9-49ed-8229-7e263d421a33/volumes" Sep 30 04:26:29 crc kubenswrapper[4744]: I0930 04:26:29.559406 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkkrn\" (UniqueName: \"kubernetes.io/projected/9bb90cec-fcf9-49ed-8229-7e263d421a33-kube-api-access-zkkrn\") on node \"crc\" DevicePath \"\"" Sep 30 04:26:30 crc kubenswrapper[4744]: I0930 04:26:30.283231 4744 scope.go:117] "RemoveContainer" containerID="9f30ddf658f8b180a94645caab792bcbf85b6c2a84e4e0761f9815d64343f724" Sep 30 04:26:30 crc kubenswrapper[4744]: I0930 04:26:30.283262 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/crc-debug-kt44l" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.358264 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nh8lm_464a78d3-19ea-4024-95f8-65c384a11de5/manager/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.368468 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-nh8lm_464a78d3-19ea-4024-95f8-65c384a11de5/kube-rbac-proxy/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.512088 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lfgcl_48c24f9c-7ad2-4b16-8586-a98cc6f5745d/kube-rbac-proxy/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.594137 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lfgcl_48c24f9c-7ad2-4b16-8586-a98cc6f5745d/manager/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.671888 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-r4pn4_2754694b-4135-4439-ae89-dd08166467a5/manager/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.718917 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-r4pn4_2754694b-4135-4439-ae89-dd08166467a5/kube-rbac-proxy/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.770019 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/util/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.949826 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/util/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.977479 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/pull/0.log" Sep 30 04:26:31 crc kubenswrapper[4744]: I0930 04:26:31.986970 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/pull/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.203620 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/extract/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.211610 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/util/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.239889 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fcb715ece9f005e1a03c130eeea8b9b209953c0686aefe28df3e5ad2ff6zrfq_9ff2e9a1-f1f5-48ca-b764-61b87af9c7ef/pull/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.376678 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-f5wr4_7c60f5e8-9ac8-4729-9030-a17a74c66872/kube-rbac-proxy/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.462591 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-f5wr4_7c60f5e8-9ac8-4729-9030-a17a74c66872/manager/0.log" Sep 30 04:26:32 crc 
kubenswrapper[4744]: I0930 04:26:32.472205 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-jrs8k_c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad/kube-rbac-proxy/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.601197 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-jrs8k_c1a11cfd-aa9b-4aaf-9d4f-59b7308620ad/manager/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.648540 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-jrmql_a6acae25-c5f3-4719-9a0d-866cef31aae8/kube-rbac-proxy/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.693818 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-jrmql_a6acae25-c5f3-4719-9a0d-866cef31aae8/manager/0.log" Sep 30 04:26:32 crc kubenswrapper[4744]: I0930 04:26:32.887939 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-z8f6l_a21e2f23-2adc-4f24-be18-72c39bb6ac8e/kube-rbac-proxy/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.024434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-z8f6l_a21e2f23-2adc-4f24-be18-72c39bb6ac8e/manager/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.051439 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-plwv5_1416686e-3057-4219-93e8-b6ed99e1b000/kube-rbac-proxy/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.087992 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-plwv5_1416686e-3057-4219-93e8-b6ed99e1b000/manager/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.213535 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-n4bm8_befb38ef-208d-435f-820a-787301b3c4b8/kube-rbac-proxy/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.326919 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-n4bm8_befb38ef-208d-435f-820a-787301b3c4b8/manager/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.334738 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-xc7mx_edff3052-2bfd-47d9-be42-5d8f608fc529/kube-rbac-proxy/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.477788 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-xc7mx_edff3052-2bfd-47d9-be42-5d8f608fc529/manager/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.517116 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5whrj_7955a9b4-f81b-45cd-bc57-b96bef24b064/kube-rbac-proxy/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.595940 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5whrj_7955a9b4-f81b-45cd-bc57-b96bef24b064/manager/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.770105 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-hl4qt_adaa00a7-7a31-40ae-975e-47306e8128e8/kube-rbac-proxy/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.825581 4744 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-hl4qt_adaa00a7-7a31-40ae-975e-47306e8128e8/manager/0.log" Sep 30 04:26:33 crc kubenswrapper[4744]: I0930 04:26:33.924462 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hjmhz_be907fa2-e5ce-461e-bad7-7ff67b7b28fc/kube-rbac-proxy/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.029189 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-hjmhz_be907fa2-e5ce-461e-bad7-7ff67b7b28fc/manager/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.053800 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-zvxch_bcc255bc-d09d-4f16-b541-4e206fb39a80/kube-rbac-proxy/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.132849 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-zvxch_bcc255bc-d09d-4f16-b541-4e206fb39a80/manager/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.234191 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-gmkqb_2fdc94bc-95cf-4a16-a6cc-0d277f4969bc/kube-rbac-proxy/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.307751 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-gmkqb_2fdc94bc-95cf-4a16-a6cc-0d277f4969bc/manager/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.365612 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ff84bd547-bw5gx_1dc02df0-0d0c-49bc-b3e8-69efc93c3167/kube-rbac-proxy/0.log" Sep 
30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.512277 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d55cf86f4-4xvw5_fec734bd-0bd4-4e73-9d2d-cd6f0f002577/kube-rbac-proxy/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.785446 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d55cf86f4-4xvw5_fec734bd-0bd4-4e73-9d2d-cd6f0f002577/operator/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.844783 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2drwl_f429d57e-28b9-4f82-bb1f-494d295492d1/registry-server/0.log" Sep 30 04:26:34 crc kubenswrapper[4744]: I0930 04:26:34.993327 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-kd8v7_5c70f54b-6405-4dcc-a2d2-e989b2516f0e/kube-rbac-proxy/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.167656 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-kd8v7_5c70f54b-6405-4dcc-a2d2-e989b2516f0e/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.242200 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-pcn79_015eb732-be5e-404f-81e2-b43d012c356b/kube-rbac-proxy/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.302624 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-pcn79_015eb732-be5e-404f-81e2-b43d012c356b/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.468086 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ff84bd547-bw5gx_1dc02df0-0d0c-49bc-b3e8-69efc93c3167/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.468232 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-2vpmm_bb137b23-9366-4d2c-bc9d-ec50ccaef237/operator/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.488568 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-g6w7n_c46f8a8d-f07f-4983-9971-6b06d47c8e38/kube-rbac-proxy/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.600129 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-g6w7n_c46f8a8d-f07f-4983-9971-6b06d47c8e38/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.634397 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5fx9s_5ae4d03d-68a5-498a-992f-df43dbeebc73/kube-rbac-proxy/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.759310 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-5fx9s_5ae4d03d-68a5-498a-992f-df43dbeebc73/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.788627 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-p5zk9_5502e6e0-3d2f-479c-a53f-005bbb749631/kube-rbac-proxy/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.820238 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-p5zk9_5502e6e0-3d2f-479c-a53f-005bbb749631/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.947931 4744 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-pvzfj_6293cdef-44a9-4639-a40d-df02e9aa8410/manager/0.log" Sep 30 04:26:35 crc kubenswrapper[4744]: I0930 04:26:35.975880 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-pvzfj_6293cdef-44a9-4639-a40d-df02e9aa8410/kube-rbac-proxy/0.log" Sep 30 04:26:53 crc kubenswrapper[4744]: I0930 04:26:53.135519 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ssdfc_6c6b1765-0a44-41b0-9f4c-d0e1cb8f434e/control-plane-machine-set-operator/0.log" Sep 30 04:26:53 crc kubenswrapper[4744]: I0930 04:26:53.326462 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxxmx_e771fd9b-4d78-4117-ac7c-40595fa5eb0b/machine-api-operator/0.log" Sep 30 04:26:53 crc kubenswrapper[4744]: I0930 04:26:53.329318 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jxxmx_e771fd9b-4d78-4117-ac7c-40595fa5eb0b/kube-rbac-proxy/0.log" Sep 30 04:27:06 crc kubenswrapper[4744]: I0930 04:27:06.170073 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-tcddl_c692f12b-868a-4985-8c61-529463a4bbf5/cert-manager-controller/0.log" Sep 30 04:27:06 crc kubenswrapper[4744]: I0930 04:27:06.307421 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-vhcl2_fa738a2a-d979-4352-82d3-ed7eb89e8fd9/cert-manager-cainjector/0.log" Sep 30 04:27:06 crc kubenswrapper[4744]: I0930 04:27:06.364601 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-fc4dx_05f345c2-2e42-4cf0-85f6-6a40551d51d7/cert-manager-webhook/0.log" Sep 30 04:27:19 crc kubenswrapper[4744]: I0930 
04:27:19.359529 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-kg924_c58a40af-7fd8-4a82-8109-855fbb1c32f3/nmstate-console-plugin/0.log" Sep 30 04:27:19 crc kubenswrapper[4744]: I0930 04:27:19.577816 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-57mt4_1b5d08ff-f7f3-4f08-a8be-dd45390037e4/nmstate-handler/0.log" Sep 30 04:27:19 crc kubenswrapper[4744]: I0930 04:27:19.592434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tt4lr_ca163139-502b-44cc-ae53-83bc49866259/kube-rbac-proxy/0.log" Sep 30 04:27:19 crc kubenswrapper[4744]: I0930 04:27:19.658989 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tt4lr_ca163139-502b-44cc-ae53-83bc49866259/nmstate-metrics/0.log" Sep 30 04:27:19 crc kubenswrapper[4744]: I0930 04:27:19.763100 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-8lx9f_1b8b0be2-d1e6-44b8-b5ba-af3ee8cfed0c/nmstate-operator/0.log" Sep 30 04:27:19 crc kubenswrapper[4744]: I0930 04:27:19.854488 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-lclrw_37a9886e-c2c0-46ab-a260-57231999e956/nmstate-webhook/0.log" Sep 30 04:27:34 crc kubenswrapper[4744]: I0930 04:27:34.347832 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:27:34 crc kubenswrapper[4744]: I0930 04:27:34.348635 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:27:34 crc kubenswrapper[4744]: I0930 04:27:34.851558 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-jg7r7_15204744-c1cf-4027-8131-fd89b0544638/kube-rbac-proxy/0.log" Sep 30 04:27:34 crc kubenswrapper[4744]: I0930 04:27:34.962042 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-jg7r7_15204744-c1cf-4027-8131-fd89b0544638/controller/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.029475 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.264731 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.267962 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.300460 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.320560 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.467038 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.484493 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.501915 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.534470 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.678221 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-frr-files/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.699739 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-metrics/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.721518 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/cp-reloader/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.731522 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/controller/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.871884 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/frr-metrics/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.931323 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/kube-rbac-proxy-frr/0.log" Sep 30 04:27:35 crc kubenswrapper[4744]: I0930 04:27:35.936973 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/kube-rbac-proxy/0.log" Sep 30 04:27:36 crc kubenswrapper[4744]: I0930 04:27:36.032414 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/reloader/0.log" Sep 30 04:27:36 crc kubenswrapper[4744]: I0930 04:27:36.108133 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-bwg9t_416e2fa8-29ae-42c2-a71a-863244e1b5df/frr-k8s-webhook-server/0.log" Sep 30 04:27:36 crc kubenswrapper[4744]: I0930 04:27:36.298437 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58d4cc4478-pt5sc_3d8f001e-888e-4f58-b3c6-6b8b0fddaf3e/manager/0.log" Sep 30 04:27:36 crc kubenswrapper[4744]: I0930 04:27:36.429361 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-67f6c5dc78-c7h6t_cc149279-3823-4588-a559-a348efdb9bcd/webhook-server/0.log" Sep 30 04:27:36 crc kubenswrapper[4744]: I0930 04:27:36.575380 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dthlc_5ae230cf-d8e3-49d5-a336-fd028e0f5303/kube-rbac-proxy/0.log" Sep 30 04:27:37 crc kubenswrapper[4744]: I0930 04:27:37.104567 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dthlc_5ae230cf-d8e3-49d5-a336-fd028e0f5303/speaker/0.log" Sep 30 04:27:37 crc kubenswrapper[4744]: I0930 04:27:37.375148 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p7zk7_c5228111-8563-4e96-abae-b748a4677ff8/frr/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: I0930 04:27:50.555989 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/util/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: 
I0930 04:27:50.692724 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/util/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: I0930 04:27:50.750712 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/pull/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: I0930 04:27:50.753266 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/pull/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: I0930 04:27:50.889077 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/util/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: I0930 04:27:50.894174 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/pull/0.log" Sep 30 04:27:50 crc kubenswrapper[4744]: I0930 04:27:50.934346 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcgfvcl_9146dff8-2558-4464-8f88-5700a10d2ab3/extract/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.049292 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-utilities/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.236485 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-content/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.255757 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-content/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.258074 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-utilities/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.405944 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-utilities/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.406902 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/extract-content/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.591395 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-utilities/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.807958 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-utilities/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.842434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-content/0.log" Sep 30 04:27:51 crc kubenswrapper[4744]: I0930 04:27:51.896830 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-content/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.061382 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxk7c_0bfa3a3d-a0d3-4b2e-8ca0-2450a2b6ddc6/registry-server/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.077330 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-content/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.120153 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/extract-utilities/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.240265 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/util/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.464921 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/pull/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.513842 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/pull/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.529931 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/util/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.718740 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/util/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.725989 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/pull/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.749000 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96pmjc4_9f265019-d7fa-4768-95f1-aeefab156c9c/extract/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.871261 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ql7fl_a70de5bd-856c-42de-a059-e533218cf02b/registry-server/0.log" Sep 30 04:27:52 crc kubenswrapper[4744]: I0930 04:27:52.943333 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wkhkg_fe8983ae-8985-4ff0-8fbe-8ab1b8bb4280/marketplace-operator/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.064722 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-utilities/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.219979 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-utilities/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.240290 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-content/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.240815 4744 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-content/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.377883 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-content/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.382491 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/extract-utilities/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.544095 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-utilities/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.554377 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ts2v8_23313229-2e34-4cf4-988e-e273962bec95/registry-server/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.765457 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-utilities/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.814661 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-content/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.855858 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-content/0.log" Sep 30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.987452 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-content/0.log" Sep 
30 04:27:53 crc kubenswrapper[4744]: I0930 04:27:53.997293 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/extract-utilities/0.log" Sep 30 04:27:54 crc kubenswrapper[4744]: I0930 04:27:54.656300 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zpn7m_f06dd885-034b-4e39-bb3b-689087c8a26c/registry-server/0.log" Sep 30 04:28:04 crc kubenswrapper[4744]: I0930 04:28:04.347694 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:28:04 crc kubenswrapper[4744]: I0930 04:28:04.350505 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.348663 4744 patch_prober.go:28] interesting pod/machine-config-daemon-kp8zv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.349413 4744 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 04:28:34 crc kubenswrapper[4744]: 
I0930 04:28:34.349495 4744 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.350470 4744 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15"} pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.350578 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerName="machine-config-daemon" containerID="cri-o://3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" gracePeriod=600 Sep 30 04:28:34 crc kubenswrapper[4744]: E0930 04:28:34.483200 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.589924 4744 generic.go:334] "Generic (PLEG): container finished" podID="a0ffb258-115f-4a60-92da-91d4a9036c10" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" exitCode=0 Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.589987 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" 
event={"ID":"a0ffb258-115f-4a60-92da-91d4a9036c10","Type":"ContainerDied","Data":"3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15"} Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.590035 4744 scope.go:117] "RemoveContainer" containerID="daf39e736917ca0a1b7fe85b78046b281e6dcd02d5f095b3a5da30b062ada306" Sep 30 04:28:34 crc kubenswrapper[4744]: I0930 04:28:34.591353 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:28:34 crc kubenswrapper[4744]: E0930 04:28:34.592015 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:28:45 crc kubenswrapper[4744]: I0930 04:28:45.506415 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:28:45 crc kubenswrapper[4744]: E0930 04:28:45.507470 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.354255 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z929j"] Sep 30 04:28:54 crc kubenswrapper[4744]: E0930 04:28:54.355639 4744 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9bb90cec-fcf9-49ed-8229-7e263d421a33" containerName="container-00" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.355665 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb90cec-fcf9-49ed-8229-7e263d421a33" containerName="container-00" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.355998 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb90cec-fcf9-49ed-8229-7e263d421a33" containerName="container-00" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.359473 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.370203 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z929j"] Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.504334 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-catalog-content\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.504412 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-utilities\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.504764 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdr4q\" (UniqueName: \"kubernetes.io/projected/16994141-0df7-4e89-92b7-cb36e9819e6a-kube-api-access-xdr4q\") pod \"redhat-marketplace-z929j\" (UID: 
\"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.607088 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdr4q\" (UniqueName: \"kubernetes.io/projected/16994141-0df7-4e89-92b7-cb36e9819e6a-kube-api-access-xdr4q\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.607203 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-catalog-content\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.607248 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-utilities\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.607772 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-catalog-content\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.607912 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-utilities\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " 
pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.632344 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdr4q\" (UniqueName: \"kubernetes.io/projected/16994141-0df7-4e89-92b7-cb36e9819e6a-kube-api-access-xdr4q\") pod \"redhat-marketplace-z929j\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.702955 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.948105 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xqt8s"] Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.951322 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:54 crc kubenswrapper[4744]: I0930 04:28:54.960024 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqt8s"] Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.116656 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-utilities\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.116707 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-catalog-content\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc 
kubenswrapper[4744]: I0930 04:28:55.116727 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kbr5\" (UniqueName: \"kubernetes.io/projected/6f052070-73e1-409d-be1c-27401060b45f-kube-api-access-5kbr5\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.201020 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z929j"] Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.221535 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-utilities\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.221583 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-catalog-content\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.221603 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kbr5\" (UniqueName: \"kubernetes.io/projected/6f052070-73e1-409d-be1c-27401060b45f-kube-api-access-5kbr5\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.222228 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-catalog-content\") 
pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.222398 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-utilities\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.241135 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kbr5\" (UniqueName: \"kubernetes.io/projected/6f052070-73e1-409d-be1c-27401060b45f-kube-api-access-5kbr5\") pod \"redhat-operators-xqt8s\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.276327 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.694455 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqt8s"] Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.867995 4744 generic.go:334] "Generic (PLEG): container finished" podID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerID="47d5942281aef5e02f93c2edfd4ab38232bcfd4a49577427fbd36f5b54e6bcab" exitCode=0 Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.868056 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z929j" event={"ID":"16994141-0df7-4e89-92b7-cb36e9819e6a","Type":"ContainerDied","Data":"47d5942281aef5e02f93c2edfd4ab38232bcfd4a49577427fbd36f5b54e6bcab"} Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.868085 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z929j" event={"ID":"16994141-0df7-4e89-92b7-cb36e9819e6a","Type":"ContainerStarted","Data":"442e75d2d76a6b4b087c029b7613b21b261207d7c2a74399a7f3badb4518a1a4"} Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.869338 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerStarted","Data":"d866c5e4bd192e080fad7df3d9925687118bd941e39ece199cdde3fdd67bde31"} Sep 30 04:28:55 crc kubenswrapper[4744]: I0930 04:28:55.870150 4744 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 04:28:56 crc kubenswrapper[4744]: I0930 04:28:56.882551 4744 generic.go:334] "Generic (PLEG): container finished" podID="6f052070-73e1-409d-be1c-27401060b45f" containerID="b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788" exitCode=0 Sep 30 04:28:56 crc kubenswrapper[4744]: I0930 04:28:56.882601 4744 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerDied","Data":"b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788"} Sep 30 04:28:57 crc kubenswrapper[4744]: I0930 04:28:57.894200 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerStarted","Data":"52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e"} Sep 30 04:28:57 crc kubenswrapper[4744]: I0930 04:28:57.897329 4744 generic.go:334] "Generic (PLEG): container finished" podID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerID="83ac417052b41d0ffeda51d5e47cb54dee7ab762879609035084c8ddc8c5cda8" exitCode=0 Sep 30 04:28:57 crc kubenswrapper[4744]: I0930 04:28:57.897657 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z929j" event={"ID":"16994141-0df7-4e89-92b7-cb36e9819e6a","Type":"ContainerDied","Data":"83ac417052b41d0ffeda51d5e47cb54dee7ab762879609035084c8ddc8c5cda8"} Sep 30 04:28:58 crc kubenswrapper[4744]: I0930 04:28:58.916924 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z929j" event={"ID":"16994141-0df7-4e89-92b7-cb36e9819e6a","Type":"ContainerStarted","Data":"cccb40de2201316d8c4988f9c88749244d396dd13bd4184920ac3efca74c81e3"} Sep 30 04:28:58 crc kubenswrapper[4744]: I0930 04:28:58.944157 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z929j" podStartSLOduration=2.356549679 podStartE2EDuration="4.94413446s" podCreationTimestamp="2025-09-30 04:28:54 +0000 UTC" firstStartedPulling="2025-09-30 04:28:55.869748245 +0000 UTC m=+5663.042968219" lastFinishedPulling="2025-09-30 04:28:58.457333026 +0000 UTC m=+5665.630553000" observedRunningTime="2025-09-30 04:28:58.937308172 +0000 UTC m=+5666.110528156" 
watchObservedRunningTime="2025-09-30 04:28:58.94413446 +0000 UTC m=+5666.117354444" Sep 30 04:28:59 crc kubenswrapper[4744]: I0930 04:28:59.504056 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:28:59 crc kubenswrapper[4744]: E0930 04:28:59.504650 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:28:59 crc kubenswrapper[4744]: I0930 04:28:59.934987 4744 generic.go:334] "Generic (PLEG): container finished" podID="6f052070-73e1-409d-be1c-27401060b45f" containerID="52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e" exitCode=0 Sep 30 04:28:59 crc kubenswrapper[4744]: I0930 04:28:59.936396 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerDied","Data":"52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e"} Sep 30 04:29:00 crc kubenswrapper[4744]: I0930 04:29:00.960904 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerStarted","Data":"58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea"} Sep 30 04:29:01 crc kubenswrapper[4744]: I0930 04:29:01.004410 4744 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xqt8s" podStartSLOduration=3.547782254 podStartE2EDuration="7.004361277s" podCreationTimestamp="2025-09-30 04:28:54 +0000 UTC" firstStartedPulling="2025-09-30 
04:28:56.88730665 +0000 UTC m=+5664.060526634" lastFinishedPulling="2025-09-30 04:29:00.343885653 +0000 UTC m=+5667.517105657" observedRunningTime="2025-09-30 04:29:00.994276776 +0000 UTC m=+5668.167496760" watchObservedRunningTime="2025-09-30 04:29:01.004361277 +0000 UTC m=+5668.177581261" Sep 30 04:29:04 crc kubenswrapper[4744]: I0930 04:29:04.703585 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:29:04 crc kubenswrapper[4744]: I0930 04:29:04.704178 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:29:04 crc kubenswrapper[4744]: I0930 04:29:04.768636 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:29:05 crc kubenswrapper[4744]: I0930 04:29:05.089079 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:29:05 crc kubenswrapper[4744]: I0930 04:29:05.277338 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:29:05 crc kubenswrapper[4744]: I0930 04:29:05.277709 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:29:05 crc kubenswrapper[4744]: I0930 04:29:05.944912 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z929j"] Sep 30 04:29:06 crc kubenswrapper[4744]: I0930 04:29:06.360352 4744 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xqt8s" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="registry-server" probeResult="failure" output=< Sep 30 04:29:06 crc kubenswrapper[4744]: timeout: failed to connect service ":50051" within 1s Sep 30 04:29:06 crc 
kubenswrapper[4744]: > Sep 30 04:29:07 crc kubenswrapper[4744]: I0930 04:29:07.024763 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z929j" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="registry-server" containerID="cri-o://cccb40de2201316d8c4988f9c88749244d396dd13bd4184920ac3efca74c81e3" gracePeriod=2 Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.037608 4744 generic.go:334] "Generic (PLEG): container finished" podID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerID="cccb40de2201316d8c4988f9c88749244d396dd13bd4184920ac3efca74c81e3" exitCode=0 Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.037693 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z929j" event={"ID":"16994141-0df7-4e89-92b7-cb36e9819e6a","Type":"ContainerDied","Data":"cccb40de2201316d8c4988f9c88749244d396dd13bd4184920ac3efca74c81e3"} Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.037950 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z929j" event={"ID":"16994141-0df7-4e89-92b7-cb36e9819e6a","Type":"ContainerDied","Data":"442e75d2d76a6b4b087c029b7613b21b261207d7c2a74399a7f3badb4518a1a4"} Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.037970 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="442e75d2d76a6b4b087c029b7613b21b261207d7c2a74399a7f3badb4518a1a4" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.141360 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.233541 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-catalog-content\") pod \"16994141-0df7-4e89-92b7-cb36e9819e6a\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.233602 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-utilities\") pod \"16994141-0df7-4e89-92b7-cb36e9819e6a\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.233767 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdr4q\" (UniqueName: \"kubernetes.io/projected/16994141-0df7-4e89-92b7-cb36e9819e6a-kube-api-access-xdr4q\") pod \"16994141-0df7-4e89-92b7-cb36e9819e6a\" (UID: \"16994141-0df7-4e89-92b7-cb36e9819e6a\") " Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.234753 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-utilities" (OuterVolumeSpecName: "utilities") pod "16994141-0df7-4e89-92b7-cb36e9819e6a" (UID: "16994141-0df7-4e89-92b7-cb36e9819e6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.249660 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16994141-0df7-4e89-92b7-cb36e9819e6a" (UID: "16994141-0df7-4e89-92b7-cb36e9819e6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.250415 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16994141-0df7-4e89-92b7-cb36e9819e6a-kube-api-access-xdr4q" (OuterVolumeSpecName: "kube-api-access-xdr4q") pod "16994141-0df7-4e89-92b7-cb36e9819e6a" (UID: "16994141-0df7-4e89-92b7-cb36e9819e6a"). InnerVolumeSpecName "kube-api-access-xdr4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.337064 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.337117 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16994141-0df7-4e89-92b7-cb36e9819e6a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:29:08 crc kubenswrapper[4744]: I0930 04:29:08.337141 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdr4q\" (UniqueName: \"kubernetes.io/projected/16994141-0df7-4e89-92b7-cb36e9819e6a-kube-api-access-xdr4q\") on node \"crc\" DevicePath \"\"" Sep 30 04:29:09 crc kubenswrapper[4744]: I0930 04:29:09.047216 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z929j" Sep 30 04:29:09 crc kubenswrapper[4744]: I0930 04:29:09.090646 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z929j"] Sep 30 04:29:09 crc kubenswrapper[4744]: I0930 04:29:09.103637 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z929j"] Sep 30 04:29:09 crc kubenswrapper[4744]: I0930 04:29:09.518618 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" path="/var/lib/kubelet/pods/16994141-0df7-4e89-92b7-cb36e9819e6a/volumes" Sep 30 04:29:11 crc kubenswrapper[4744]: I0930 04:29:11.504670 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:29:11 crc kubenswrapper[4744]: E0930 04:29:11.505423 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:29:15 crc kubenswrapper[4744]: I0930 04:29:15.392422 4744 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:29:15 crc kubenswrapper[4744]: I0930 04:29:15.486071 4744 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:29:15 crc kubenswrapper[4744]: I0930 04:29:15.656278 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqt8s"] Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.170990 4744 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-xqt8s" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="registry-server" containerID="cri-o://58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea" gracePeriod=2 Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.735534 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.851479 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-catalog-content\") pod \"6f052070-73e1-409d-be1c-27401060b45f\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.851572 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-utilities\") pod \"6f052070-73e1-409d-be1c-27401060b45f\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.851820 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kbr5\" (UniqueName: \"kubernetes.io/projected/6f052070-73e1-409d-be1c-27401060b45f-kube-api-access-5kbr5\") pod \"6f052070-73e1-409d-be1c-27401060b45f\" (UID: \"6f052070-73e1-409d-be1c-27401060b45f\") " Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.853505 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-utilities" (OuterVolumeSpecName: "utilities") pod "6f052070-73e1-409d-be1c-27401060b45f" (UID: "6f052070-73e1-409d-be1c-27401060b45f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.855712 4744 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.857813 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f052070-73e1-409d-be1c-27401060b45f-kube-api-access-5kbr5" (OuterVolumeSpecName: "kube-api-access-5kbr5") pod "6f052070-73e1-409d-be1c-27401060b45f" (UID: "6f052070-73e1-409d-be1c-27401060b45f"). InnerVolumeSpecName "kube-api-access-5kbr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.947030 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f052070-73e1-409d-be1c-27401060b45f" (UID: "6f052070-73e1-409d-be1c-27401060b45f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.957576 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kbr5\" (UniqueName: \"kubernetes.io/projected/6f052070-73e1-409d-be1c-27401060b45f-kube-api-access-5kbr5\") on node \"crc\" DevicePath \"\"" Sep 30 04:29:17 crc kubenswrapper[4744]: I0930 04:29:17.957622 4744 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f052070-73e1-409d-be1c-27401060b45f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.184424 4744 generic.go:334] "Generic (PLEG): container finished" podID="6f052070-73e1-409d-be1c-27401060b45f" containerID="58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea" exitCode=0 Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.184672 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerDied","Data":"58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea"} Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.186532 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqt8s" event={"ID":"6f052070-73e1-409d-be1c-27401060b45f","Type":"ContainerDied","Data":"d866c5e4bd192e080fad7df3d9925687118bd941e39ece199cdde3fdd67bde31"} Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.186678 4744 scope.go:117] "RemoveContainer" containerID="58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.184783 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqt8s" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.212983 4744 scope.go:117] "RemoveContainer" containerID="52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.243949 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqt8s"] Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.252350 4744 scope.go:117] "RemoveContainer" containerID="b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.254169 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xqt8s"] Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.311938 4744 scope.go:117] "RemoveContainer" containerID="58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea" Sep 30 04:29:18 crc kubenswrapper[4744]: E0930 04:29:18.315281 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea\": container with ID starting with 58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea not found: ID does not exist" containerID="58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.315334 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea"} err="failed to get container status \"58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea\": rpc error: code = NotFound desc = could not find container \"58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea\": container with ID starting with 58271f8c3b05ac0115ff415cb32f8f6daacf12e7d1601dc3a79b2c156df7f2ea not found: ID does 
not exist" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.315387 4744 scope.go:117] "RemoveContainer" containerID="52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e" Sep 30 04:29:18 crc kubenswrapper[4744]: E0930 04:29:18.316187 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e\": container with ID starting with 52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e not found: ID does not exist" containerID="52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.316221 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e"} err="failed to get container status \"52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e\": rpc error: code = NotFound desc = could not find container \"52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e\": container with ID starting with 52113c1f60e294e282a0c45d36f3335f1c1a6208f9f31d33bd9b98929ec8712e not found: ID does not exist" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.316242 4744 scope.go:117] "RemoveContainer" containerID="b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788" Sep 30 04:29:18 crc kubenswrapper[4744]: E0930 04:29:18.317041 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788\": container with ID starting with b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788 not found: ID does not exist" containerID="b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788" Sep 30 04:29:18 crc kubenswrapper[4744]: I0930 04:29:18.317078 4744 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788"} err="failed to get container status \"b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788\": rpc error: code = NotFound desc = could not find container \"b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788\": container with ID starting with b85b00480dcb232fb36503e4e2d5f7e02dbb65bbb9dc0b57068966fcf0aea788 not found: ID does not exist" Sep 30 04:29:19 crc kubenswrapper[4744]: I0930 04:29:19.524242 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f052070-73e1-409d-be1c-27401060b45f" path="/var/lib/kubelet/pods/6f052070-73e1-409d-be1c-27401060b45f/volumes" Sep 30 04:29:22 crc kubenswrapper[4744]: I0930 04:29:22.506097 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:29:22 crc kubenswrapper[4744]: E0930 04:29:22.507297 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:29:37 crc kubenswrapper[4744]: I0930 04:29:37.503688 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:29:37 crc kubenswrapper[4744]: E0930 04:29:37.504713 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:29:52 crc kubenswrapper[4744]: I0930 04:29:52.504048 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:29:52 crc kubenswrapper[4744]: E0930 04:29:52.505561 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.163854 4744 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm"] Sep 30 04:30:00 crc kubenswrapper[4744]: E0930 04:30:00.165096 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="registry-server" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165118 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="registry-server" Sep 30 04:30:00 crc kubenswrapper[4744]: E0930 04:30:00.165140 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="extract-utilities" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165153 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="extract-utilities" Sep 30 04:30:00 crc kubenswrapper[4744]: E0930 04:30:00.165191 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="extract-content" Sep 30 04:30:00 crc 
kubenswrapper[4744]: I0930 04:30:00.165205 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="extract-content" Sep 30 04:30:00 crc kubenswrapper[4744]: E0930 04:30:00.165230 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="registry-server" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165241 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="registry-server" Sep 30 04:30:00 crc kubenswrapper[4744]: E0930 04:30:00.165269 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="extract-utilities" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165281 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="extract-utilities" Sep 30 04:30:00 crc kubenswrapper[4744]: E0930 04:30:00.165313 4744 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="extract-content" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165325 4744 state_mem.go:107] "Deleted CPUSet assignment" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="extract-content" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165737 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="16994141-0df7-4e89-92b7-cb36e9819e6a" containerName="registry-server" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.165772 4744 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f052070-73e1-409d-be1c-27401060b45f" containerName="registry-server" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.166846 4744 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.171553 4744 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.172044 4744 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.191650 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm"] Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.327558 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357b752-8919-4317-a61e-cd48f602b76f-config-volume\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.327654 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2357b752-8919-4317-a61e-cd48f602b76f-secret-volume\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.327705 4744 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2lc\" (UniqueName: \"kubernetes.io/projected/2357b752-8919-4317-a61e-cd48f602b76f-kube-api-access-7b2lc\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.430824 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357b752-8919-4317-a61e-cd48f602b76f-config-volume\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.430932 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2357b752-8919-4317-a61e-cd48f602b76f-secret-volume\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.430980 4744 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2lc\" (UniqueName: \"kubernetes.io/projected/2357b752-8919-4317-a61e-cd48f602b76f-kube-api-access-7b2lc\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.432091 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357b752-8919-4317-a61e-cd48f602b76f-config-volume\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.444242 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2357b752-8919-4317-a61e-cd48f602b76f-secret-volume\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.456223 4744 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2lc\" (UniqueName: \"kubernetes.io/projected/2357b752-8919-4317-a61e-cd48f602b76f-kube-api-access-7b2lc\") pod \"collect-profiles-29320110-ks9fm\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:00 crc kubenswrapper[4744]: I0930 04:30:00.542416 4744 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:01 crc kubenswrapper[4744]: I0930 04:30:01.011615 4744 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm"] Sep 30 04:30:01 crc kubenswrapper[4744]: W0930 04:30:01.019072 4744 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2357b752_8919_4317_a61e_cd48f602b76f.slice/crio-cb7eddc36c098616e814ca74d0f28544a928cc03bc5122da18e5764234e9f01b WatchSource:0}: Error finding container cb7eddc36c098616e814ca74d0f28544a928cc03bc5122da18e5764234e9f01b: Status 404 returned error can't find the container with id cb7eddc36c098616e814ca74d0f28544a928cc03bc5122da18e5764234e9f01b Sep 30 04:30:01 crc kubenswrapper[4744]: I0930 04:30:01.783855 4744 generic.go:334] "Generic (PLEG): container finished" podID="2357b752-8919-4317-a61e-cd48f602b76f" containerID="f64e566ceae353a072b62316a6bb240502bdd449eda80d7cf9ac219410d05ce6" exitCode=0 Sep 30 04:30:01 crc kubenswrapper[4744]: I0930 04:30:01.783972 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" event={"ID":"2357b752-8919-4317-a61e-cd48f602b76f","Type":"ContainerDied","Data":"f64e566ceae353a072b62316a6bb240502bdd449eda80d7cf9ac219410d05ce6"} Sep 30 04:30:01 crc kubenswrapper[4744]: I0930 04:30:01.784237 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" event={"ID":"2357b752-8919-4317-a61e-cd48f602b76f","Type":"ContainerStarted","Data":"cb7eddc36c098616e814ca74d0f28544a928cc03bc5122da18e5764234e9f01b"} Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.164321 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.193049 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b2lc\" (UniqueName: \"kubernetes.io/projected/2357b752-8919-4317-a61e-cd48f602b76f-kube-api-access-7b2lc\") pod \"2357b752-8919-4317-a61e-cd48f602b76f\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.193104 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357b752-8919-4317-a61e-cd48f602b76f-config-volume\") pod \"2357b752-8919-4317-a61e-cd48f602b76f\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.193202 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2357b752-8919-4317-a61e-cd48f602b76f-secret-volume\") pod \"2357b752-8919-4317-a61e-cd48f602b76f\" (UID: \"2357b752-8919-4317-a61e-cd48f602b76f\") " Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.194449 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2357b752-8919-4317-a61e-cd48f602b76f-config-volume" (OuterVolumeSpecName: "config-volume") pod "2357b752-8919-4317-a61e-cd48f602b76f" (UID: "2357b752-8919-4317-a61e-cd48f602b76f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.208785 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2357b752-8919-4317-a61e-cd48f602b76f-kube-api-access-7b2lc" (OuterVolumeSpecName: "kube-api-access-7b2lc") pod "2357b752-8919-4317-a61e-cd48f602b76f" (UID: "2357b752-8919-4317-a61e-cd48f602b76f"). InnerVolumeSpecName "kube-api-access-7b2lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.208973 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2357b752-8919-4317-a61e-cd48f602b76f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2357b752-8919-4317-a61e-cd48f602b76f" (UID: "2357b752-8919-4317-a61e-cd48f602b76f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.295083 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b2lc\" (UniqueName: \"kubernetes.io/projected/2357b752-8919-4317-a61e-cd48f602b76f-kube-api-access-7b2lc\") on node \"crc\" DevicePath \"\"" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.295118 4744 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357b752-8919-4317-a61e-cd48f602b76f-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.295127 4744 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2357b752-8919-4317-a61e-cd48f602b76f-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.810664 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" event={"ID":"2357b752-8919-4317-a61e-cd48f602b76f","Type":"ContainerDied","Data":"cb7eddc36c098616e814ca74d0f28544a928cc03bc5122da18e5764234e9f01b"} Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.810966 4744 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7eddc36c098616e814ca74d0f28544a928cc03bc5122da18e5764234e9f01b" Sep 30 04:30:03 crc kubenswrapper[4744]: I0930 04:30:03.810774 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320110-ks9fm" Sep 30 04:30:04 crc kubenswrapper[4744]: I0930 04:30:04.286008 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg"] Sep 30 04:30:04 crc kubenswrapper[4744]: I0930 04:30:04.299471 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320065-vm9mg"] Sep 30 04:30:04 crc kubenswrapper[4744]: I0930 04:30:04.503831 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:30:04 crc kubenswrapper[4744]: E0930 04:30:04.504857 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:30:05 crc kubenswrapper[4744]: I0930 04:30:05.527430 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f9e2978-a3a1-4545-9e15-47bb36d54b25" path="/var/lib/kubelet/pods/2f9e2978-a3a1-4545-9e15-47bb36d54b25/volumes" Sep 30 04:30:14 crc kubenswrapper[4744]: I0930 04:30:14.957093 4744 generic.go:334] "Generic (PLEG): container finished" podID="c175e32d-a72c-42cd-979b-9ced3a98d7f8" containerID="af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5" exitCode=0 Sep 30 04:30:14 crc kubenswrapper[4744]: I0930 04:30:14.957281 4744 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8km5/must-gather-b5qvd" event={"ID":"c175e32d-a72c-42cd-979b-9ced3a98d7f8","Type":"ContainerDied","Data":"af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5"} Sep 30 
04:30:14 crc kubenswrapper[4744]: I0930 04:30:14.959340 4744 scope.go:117] "RemoveContainer" containerID="af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5" Sep 30 04:30:15 crc kubenswrapper[4744]: I0930 04:30:15.640681 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l8km5_must-gather-b5qvd_c175e32d-a72c-42cd-979b-9ced3a98d7f8/gather/0.log" Sep 30 04:30:17 crc kubenswrapper[4744]: I0930 04:30:17.504667 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:30:17 crc kubenswrapper[4744]: E0930 04:30:17.505551 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.412590 4744 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l8km5/must-gather-b5qvd"] Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.413187 4744 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-l8km5/must-gather-b5qvd" podUID="c175e32d-a72c-42cd-979b-9ced3a98d7f8" containerName="copy" containerID="cri-o://1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023" gracePeriod=2 Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.423961 4744 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l8km5/must-gather-b5qvd"] Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.819434 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l8km5_must-gather-b5qvd_c175e32d-a72c-42cd-979b-9ced3a98d7f8/copy/0.log" Sep 30 
04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.819978 4744 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.940168 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c175e32d-a72c-42cd-979b-9ced3a98d7f8-must-gather-output\") pod \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.940538 4744 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f996\" (UniqueName: \"kubernetes.io/projected/c175e32d-a72c-42cd-979b-9ced3a98d7f8-kube-api-access-5f996\") pod \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\" (UID: \"c175e32d-a72c-42cd-979b-9ced3a98d7f8\") " Sep 30 04:30:28 crc kubenswrapper[4744]: I0930 04:30:28.953586 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c175e32d-a72c-42cd-979b-9ced3a98d7f8-kube-api-access-5f996" (OuterVolumeSpecName: "kube-api-access-5f996") pod "c175e32d-a72c-42cd-979b-9ced3a98d7f8" (UID: "c175e32d-a72c-42cd-979b-9ced3a98d7f8"). InnerVolumeSpecName "kube-api-access-5f996". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.042540 4744 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f996\" (UniqueName: \"kubernetes.io/projected/c175e32d-a72c-42cd-979b-9ced3a98d7f8-kube-api-access-5f996\") on node \"crc\" DevicePath \"\"" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.135688 4744 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c175e32d-a72c-42cd-979b-9ced3a98d7f8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c175e32d-a72c-42cd-979b-9ced3a98d7f8" (UID: "c175e32d-a72c-42cd-979b-9ced3a98d7f8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.147427 4744 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c175e32d-a72c-42cd-979b-9ced3a98d7f8-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.148516 4744 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l8km5_must-gather-b5qvd_c175e32d-a72c-42cd-979b-9ced3a98d7f8/copy/0.log" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.149039 4744 generic.go:334] "Generic (PLEG): container finished" podID="c175e32d-a72c-42cd-979b-9ced3a98d7f8" containerID="1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023" exitCode=143 Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.149100 4744 scope.go:117] "RemoveContainer" containerID="1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.149250 4744 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l8km5/must-gather-b5qvd" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.207637 4744 scope.go:117] "RemoveContainer" containerID="af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.281878 4744 scope.go:117] "RemoveContainer" containerID="1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023" Sep 30 04:30:29 crc kubenswrapper[4744]: E0930 04:30:29.282385 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023\": container with ID starting with 1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023 not found: ID does not exist" containerID="1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.282431 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023"} err="failed to get container status \"1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023\": rpc error: code = NotFound desc = could not find container \"1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023\": container with ID starting with 1ef2c912855884c41ccacf1fecd046bd98d0bea5a3c8b757a90c823c34006023 not found: ID does not exist" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.282456 4744 scope.go:117] "RemoveContainer" containerID="af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5" Sep 30 04:30:29 crc kubenswrapper[4744]: E0930 04:30:29.282828 4744 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5\": container with ID starting with 
af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5 not found: ID does not exist" containerID="af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.282848 4744 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5"} err="failed to get container status \"af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5\": rpc error: code = NotFound desc = could not find container \"af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5\": container with ID starting with af61425b0a10a28baada87145615ba489c633c39cff653f27ae2d013338bc7c5 not found: ID does not exist" Sep 30 04:30:29 crc kubenswrapper[4744]: I0930 04:30:29.519451 4744 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c175e32d-a72c-42cd-979b-9ced3a98d7f8" path="/var/lib/kubelet/pods/c175e32d-a72c-42cd-979b-9ced3a98d7f8/volumes" Sep 30 04:30:32 crc kubenswrapper[4744]: I0930 04:30:32.503724 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:30:32 crc kubenswrapper[4744]: E0930 04:30:32.504731 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:30:43 crc kubenswrapper[4744]: I0930 04:30:43.917329 4744 scope.go:117] "RemoveContainer" containerID="2748e8b8124ff1700e64b2613f1d046e63ca719d72503d6581a124cba97faa6e" Sep 30 04:30:45 crc kubenswrapper[4744]: I0930 04:30:45.503741 4744 scope.go:117] "RemoveContainer" 
containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:30:45 crc kubenswrapper[4744]: E0930 04:30:45.504320 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:30:56 crc kubenswrapper[4744]: I0930 04:30:56.507756 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:30:56 crc kubenswrapper[4744]: E0930 04:30:56.508797 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:31:08 crc kubenswrapper[4744]: I0930 04:31:08.503581 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:31:08 crc kubenswrapper[4744]: E0930 04:31:08.504668 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:31:22 crc kubenswrapper[4744]: I0930 04:31:22.504395 4744 scope.go:117] 
"RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:31:22 crc kubenswrapper[4744]: E0930 04:31:22.505221 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:31:33 crc kubenswrapper[4744]: I0930 04:31:33.518607 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:31:33 crc kubenswrapper[4744]: E0930 04:31:33.521185 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:31:48 crc kubenswrapper[4744]: I0930 04:31:48.504378 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:31:48 crc kubenswrapper[4744]: E0930 04:31:48.505715 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:32:00 crc kubenswrapper[4744]: I0930 04:32:00.504098 
4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:32:00 crc kubenswrapper[4744]: E0930 04:32:00.504988 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:32:11 crc kubenswrapper[4744]: I0930 04:32:11.504489 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:32:11 crc kubenswrapper[4744]: E0930 04:32:11.505896 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10" Sep 30 04:32:24 crc kubenswrapper[4744]: I0930 04:32:24.504604 4744 scope.go:117] "RemoveContainer" containerID="3d7571369f1481decd2d9f0dee7df481c38e5843a646121ad389bcad2ca88b15" Sep 30 04:32:24 crc kubenswrapper[4744]: E0930 04:32:24.506494 4744 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kp8zv_openshift-machine-config-operator(a0ffb258-115f-4a60-92da-91d4a9036c10)\"" pod="openshift-machine-config-operator/machine-config-daemon-kp8zv" podUID="a0ffb258-115f-4a60-92da-91d4a9036c10"